Friday, December 15, 2017

Guest Post: What the robots of 'Star Wars' tell us about automation


Paul Salmon, University of the Sunshine Coast

Millions of fans all over the world are eagerly anticipating this week’s release of Star Wars: The Last Jedi, the eighth episode in the series. At last we will get some answers to questions that have been vexing us since 2015’s The Force Awakens.

Throughout the franchise, the core characters have been accompanied by a number of much-loved robots, including C-3PO, R2-D2 and, more recently, BB-8 and K-2SO. While often fulfilling the role of wise-cracking sidekicks, these and other robots also play an integral role in events.

Interestingly, they can also tell us useful things about automation, such as whether it poses dangers to us and whether robots will ever replace human workers entirely. In these films, we see the good, bad and ugly of robots - and can thus glean clues about what our technological future might look like.







The fear of replacement


One major fear is that robots and automation will replace us, despite work design principles that tell us technology should be used as a tool to assist, rather than replace, humans. In the world of Star Wars, robots (or droids as they are known) mostly assist organic lifeforms, rather than completely replace them.





R2-D2 and C-3PO in A New Hope.
Lucasfilm/IMDB




So for instance, C-3PO is a protocol droid who was designed to assist in translation, customs and etiquette. R2-D2 and the franchise’s new darling, BB-8, are both “astromech droids” designed to assist in starship maintenance.

In the most recent movie, Rogue One, an offshoot of the main franchise, we were introduced to K-2SO, a wisecracking advanced autonomous military robot who was captured and reprogrammed to switch allegiance to the rebels. K-2SO mainly acts as a co-pilot, for example when flying a U-Wing with the pilot Cassian Andor to the planet Eadu.

In most cases then, the Star Wars droids provide assistance – co-piloting ships, helping to fix things, and even serving drinks. In the world of these films, organic lifeforms are still relied upon for most skilled work.

When organic lifeforms are completely replaced, it is generally because the work is highly dangerous. For instance, during the duel between Anakin and Obi-Wan on the planet Mustafar in Revenge of the Sith, DLC-13 mining droids can be seen going about their work in the planet’s hostile lava rivers.

Further, droid armies act as the frontline in various battles throughout the films. Perhaps, in the future, we will be OK with losing our jobs if the work in question poses a significant risk to our health.




K-2SO in Rogue One.
Lucasfilm/IMDB




However, there are some exceptions to this trend in the Star Wars universe. In the realm of healthcare, for instance, droids have fully replaced organic lifeforms. In The Empire Strikes Back, a medical droid treats Luke Skywalker after his encounter with a Wampa, a yeti-like snow beast on the planet Hoth. The droid also replaces his hand following his battle with Darth Vader on the planet Bespin.

Likewise, in Revenge of the Sith, a midwife droid is seen delivering the siblings Luke and Leia on Polis Massa.




Droids assist the birth of Luke and Leia Skywalker in Revenge of the Sith.



Perhaps this is one area in which Star Wars has it wrong: here on Earth, full automation in healthcare is a long way off. Robotic assistance is the more realistic prospect and is, in fact, already here. Indeed, robots have been assisting surgeons in operating theatres for some time now.

Automated vehicles


Driverless vehicles are currently flavour of the month – but will we actually use them? In Star Wars, despite the capacity for spacecraft and starships to be fully automated, organic lifeforms still take the controls. The spaceship Millennium Falcon, for example, is mostly flown by the smuggler Han Solo and his companion Chewbacca.

Most of the ships in the Star Wars starship fleet (A-Wings, X-Wings, Y-Wings, TIE Fighters, Star Destroyers, Starfighters and more) ostensibly possess the capacity for fully automated flight; however, they are mostly flown by organic lifeforms. In The Phantom Menace, the locals on Tatooine have even taken to building and manually racing their own “pod racers”.

It seems likely that here on Earth, humans too will continue to prefer to drive, fly, sail, and ride. Despite the ability to fully automate, most people will still want to be able to take full control.

Flawless, error-proof robots?


Utopian visions often depict a future where sophisticated robots will perform highly skilled tasks, all but eradicating the costly errors that humans make. This is unlikely to be true.

A final message from the Star Wars universe is that the droids and advanced technologies are often far from perfect. In our own future, costly human errors may simply be replaced by robot designer errors.




R5-D4, the malfunctioning droid of A New Hope.
Lucasfilm/IMDB




The B1 battle droids seen in Episodes I and II lack intelligence and frequently malfunction. C-3PO is notoriously error-prone, and his probability-based estimates are often wide of the mark.

In Episode IV, A New Hope, R5-D4 (another astromech droid) malfunctions and explodes just as the farmer Owen Lars is about to buy it. Other droids are slow and clunky, such as the GNK power droid and HURID-327, the groundskeeper at the castle of Maz Kanata in The Force Awakens.

The much feared scenario, whereby robots become so intelligent that they eventually take over, is hard to imagine with this lot.

Perhaps the message from the Star Wars films is that we need to lower our expectations of robot capabilities, in the short term at least. Cars will still crash and mistakes will still be made, regardless of whether humans or robots are doing the work.

Paul Salmon, Professor of Human Factors, University of the Sunshine Coast

This article was originally published on The Conversation. Read the original article.

From the Archives: 'Cable Competition' (1991)

With all the talk about net neutrality these days, I was reminded of this letter to the editor from me published in the Chicago Tribune on September 11, 1991, about cable television monopolies:

Cable Competition
September 11, 1991 | By Richard E. Sincere Jr.

ARLINGTON, VA. — Should cable TV monopolies be ended and competition take their place?

This seemed to be the question Professor Tas Papathanasis asked in an Op-Ed article on Aug. 26 ("Inject competition into cable TV"). Unfortunately, Mr. Papathanasis does not deliver what he promises.

There is no reason, technical or economic, why cable TV companies cannot compete head-to-head in the same local market. Yet Mr. Papathanasis proposes a timid solution to the current problem of cable monopolies: make companies compete periodically for exclusive contracts to serve localities. In reality, this is no different than the current situation in most markets.

In those places where companies are allowed to compete in every sense of the word - Allentown, Pa., is one notable example - the price of basic cable service is kept low and consumers have more choices regarding the channels they receive and the services companies make available.

A temporary monopoly such as that proposed is still a monopoly; we consumers remain the losers. Once local governments allow competition among currently available services and once the federal government allows telephone companies to enter the cable TV business, consumers will be the big winners.


Thursday, December 14, 2017

Guest Post: Why Hollywood needs more films like 'Star Wars'

Neil Archer, Keele University

Try to imagine a world without Star Wars. It’s 40 years since George Lucas unveiled the first in his sci-fi franchise and, with The Last Jedi now upon us, it’s a question worth asking. A recent Vanity Fair article came to the conclusion that it’s almost impossible – unless, that is, we can imagine the past four decades without Space Invaders, Pixar, or even Photoshop.








In short, Star Wars is unavoidable. Since 1977, the Star Wars films have been the benchmark – if not the catalyst – for modern Hollywood’s “synergy-driven strategies” – linking big-screen outings with “ancillary products” in the form of action figures and other commercial tie-ins. Now owned by Disney, the Star Wars property extends to theme-park rides, videogames and, more recently, spin-off films and animated TV series.

Much like Marvel’s comic-book multiverse or the Harry “Potterverse”, the original Star Wars film – since rebranded as Episode IV: A New Hope – sits now as part of an endlessly proliferating set of episode and merchandising possibilities.

Some commentators on contemporary Hollywood bemoan this “conglomerate” logic and the types of movies it supposedly throws up. Star Wars inevitably becomes a whipping boy in this discussion, since the huge success of the 1977 film (only Gone with the Wind has sold more tickets at the box office) helped turn Hollywood away from films like The Godfather, Chinatown and Taxi Driver and taught it to rely on comic-book superheroes, literary wizards and the new worlds of CGI.







Disney/Lucasfilm




Star Wars’ other lesson is that the “standalone” movie, especially in a modern movie climate where attracting an audience is never guaranteed, is too risky. Better to rely on established franchises, “spread” across multiple titles and media platforms. A quick glance at 2017’s US box office top ten, or the more-or-less identical UK list, illustrates this logic at work on a global scale.

The more positive spin on the franchise is its capacity for extending narrative in diverse and often richer ways; a process often known as “world building”. I’m fairly comfortable among Assembled Avengers or other Fantastic Beasts – and the dispersed, interweaving story-worlds they inhabit. But when this becomes the only possibility for large-scale filmmaking, even I find modern Hollywood’s dependency on the sequel constricting. Yet it’s odd if Star Wars (to stick with its original title) takes the rap for this, since it actually has so little in common with many of the franchise films that followed it – not to mention the extended Star Wars series itself.








Strangely enough, in fact, Star Wars offers positive lessons to filmmakers and producers hoping to change things up in the Hollywood game. Some French critics serendipitously use their word for UFO - “un ovni” (objet volant non identifié) - to describe films that, like Star Wars, seemed to come from nowhere. Given sci-fi’s box-office dominance over the past 40 years, it’s hard to imagine that in 1977 it was not yet a trusted form. Consequently, no one was sure what to do with Lucas’s film. His friends did not get it. Many of the hired production crew laughed at it. In its place, 20th Century Fox promoted forgotten squibs such as Damnation Alley. We eventually learned that Lucas was planning a saga and media empire all along. But in 1977, we just saw an out-of-the-blue epic.

Breaking the mould


Since Star Wars, the possibility of Hollywood losing money has been mitigated, paradoxically, by expensive “saturation” releases, with movies marketed intensively and opened simultaneously across thousands of screens. This wasn’t the case with Lucas’s cut-price film, which opened on just 32 of them. Contemporary Hollywood, critics might say, stifles choice. But in 1977, against the grain, audiences chose Star Wars.

Why? For my six-year-old self, Star Wars was both new and exotic. Technologically it broke new ground, pre-CGI, effectively ushering in the digital effects houses that are the engines of the modern industry. But it was also a film steeped in a cinematic past both strange and familiar: a past of Flash Gordon serials, John Ford Westerns and Kurosawa’s samurai films. It was set in a sumptuously designed galaxy far, far away – yet one that harked back to older war movies, scored to an old-fashioned orchestral soundtrack. Ironically, in drawing on so many films, Star Wars was like few other films made before, or even since.

The eventual problem with Lucas’s “prequel” trilogy, starting with 1999’s The Phantom Menace, was that it was no longer engaging with any other reference points beyond its own. Shot almost entirely against a CGI green screen, and loaded down by Lucas’s wordy script, the films felt dutiful but dull: they filled gaps in the saga, but related to nothing but the series itself.

I thrilled to 2015’s The Force Awakens, and its return to the series’ original cinematic values of adventure, pathos and wit, bolstered by the agile performances of its young, relatively unknown cast. Even here, though, as director and co-writer J.J. Abrams has admitted, the film was in many respects a reprise of everything good about the 1977 movie.








My point though is that we shouldn’t look to films that imitate Star Wars, but to those which follow its example. Like Lucas’s film, Damien Chazelle’s recent La La Land was a comparatively cheap picture out of step with its time and place – a one-off, non-adapted and (I hope) sequel-proof musical; one that started out screening in a few cinemas and festivals, but eventually turned into a worldwide hit.

The film’s nostalgic heart and look also owed much to the cinematic past – to a Hollywood and Europe of the 1950s – yet it too seemed to emerge from its own gorgeously self-crafted world. It was proof that in Hollywood you don’t have to be totally original to be original. But you do need a bit of faith.

And, as in 1977, it felt refreshingly new. But the lesson we might learn from La La Land is an old one: if the Hollywood that Star Wars helped build wants to do something new, it actually needs more films like Star Wars.

Neil Archer, Lecturer in Film Studies, Keele University

This article was originally published on The Conversation. Read the original article.

Guest Post: Did 'the man who invented Christmas' plagiarize Jesus?

Matthew Robert Anderson, Concordia University

Everyone knows the story of Scrooge, a man so miserly his name has become synonymous with penny-pinching meanness. Scrooge’s conversion from miser to benefactor has been told and retold since Charles Dickens first wrote A Christmas Carol in the fall and winter of 1843. Ebenezer is a wonderful character, so richly portrayed and fascinating he’s echoed in stories from The Grinch to It’s a Wonderful Life.

Pop culture has embraced both Dickens and his tale. With this season’s The Man Who Invented Christmas, Hollywood has done it again.

But who was Scrooge before he was, well, Christopher Plummer? The inspiration for the crotchety Christmas-hater may have been those who put Dickens’ own father into debtor’s prison and were responsible for young Charles working in a shoe-blacking factory.

Some Dickens scholars believe the author’s 1843 visit to sooty Manchester, or to “the black streets of London,” (as he described them in a letter to a friend) influenced him. It may be that the fable was a moral reminder from Dickens to himself, as he teetered on financial ruin. This is the theory proposed in the book by Les Standiford on which this year’s movie is based.

Did Dickens in fact invent Christmas as we know it? Hollywood may think so, but others, like David Parker in his Christmas and Charles Dickens, vehemently disagree.

Whatever your opinion, the prevailing wisdom is that A Christmas Carol isn’t particularly religious. As a professor of biblical studies at Concordia University and also a Lutheran minister, I have a different reading.

It’s true that the celebration of the season which Scrooge discovers has much more to do with generosity, family gatherings and large cooked birds, than the Nativity. But maybe those seeking explicit scriptural references in Dickens’ story are underestimating the Victorian novelist’s skill — and his audacity. Perhaps A Christmas Carol contains an alternative to the Bible rather than a simple borrowing from it. And perhaps that’s the point.

Jesus was a master story-teller


Jesus, by all accounts another master story-teller, told a parable that, stripped of Dickens’ English waistcoats, ledgers, fog and shutters, could almost be a mirror to A Christmas Carol:

“There once was a rich man. A poor man named Lazarus lived at his gate, with nothing to eat. Lazarus died and was carried by the angels to Abraham’s side. The rich man also died.”

There follows, in Jesus’ tale, an exchange between the rich man, who is in torment, and Abraham, who acts as the guardian of paradise. It’s hard not to think of the innocent Lazarus as a precursor to Tiny Tim.

First the rich man asks for his own relief from hell. When that’s denied, he pleads: “I beg you, send Lazarus to my father’s house. I have five brothers. Let him warn them so they don’t come to this place of agony.” Abraham replies: “They have Moses and the prophets. They must listen to them.”

“No, Father Abraham!” cries the rich man, “But if someone from the dead goes to them, they will change” (Luke 16:19-31).

One can almost hear the chains of Marley’s ghost rattling. What would have happened if Father Abraham had said yes? Something very like a first-century version of A Christmas Carol.

Let’s not forget that the people of our western English-speaking past, especially artists and writers, were imbued with Biblical references and ideas. As Northrop Frye, among others, has argued, they lived and created in a world shaped by the rhythms, narratives, images and conceptions (or misconceptions) of the King James Bible.

Parable of the Rich Man and Lazarus
Was Dickens familiar with Christian scriptures? All evidence points to the fact that he was more acquainted than most. Despite an antipathy to organized religion, from 1846 to 1849 Dickens wrote a short biography of Jesus for his children, titled The Life of our Lord.

He forbade that his small retelling of Jesus’ life should be published, until not only he, but also his children, had died. The “Parable of Lazarus and the Rich Man” was one of eight stories of Jesus that Dickens chose to include in that volume. But in his story of Scrooge, Dickens was too much of a writer to leave Jesus’ parable as is, and his age too suspicious of scripture to leave it “unbroken.”

A Christmas Carol unites the deliciously horrific sensibility of the Gothic movement with the powerfully simple narrative style, joined to moral concern, typical of parables.

Was Dickens perhaps dozing off some Sunday while the rector droned on about Lazarus, until he wakened with a start dreaming of Scrooge? We will never know. But it’s an intriguing possibility.

Happy endings for the rich


Surprisingly, the Sunday after Dickens was buried in Westminster Abbey, Dean Arthur Penrhyn Stanley, preaching on exactly this text, spoke of Dickens as the “parabler” of his age. Stanley said that “By [Dickens] that veil was rent asunder which parts the various classes of society. Through his genius the rich man…was made to see and feel the presence of Lazarus at his gate.”

I would go further: Dickens took the parable, and then retold and changed it, so that the rich man gets a second chance. As a privileged societal figure who had gone through financial difficulties and who cared about the poor himself, Dickens freely adapted Jesus to come up with a story that’s ultimately more about love than judgement.

When confronted with Marley’s spectre, Scrooge, unnerved but unrepentant, addresses the apparition: “You may be an undigested bit of beef, a blot of mustard, a crumb of cheese, a fragment of underdone potato.”

The perceptive reader (or viewer) of A Christmas Carol can point a finger at Marley’s ghost and add: “Or maybe you’re an ironic but hope-filled riff on Jesus, by a famous nineteenth-century author who wanted to write his own story of redemption.”

Dickens not only invented this Christmas genre, but imagined a happy ending for himself in it. He penned an enduring story about the second chance even a rich person can receive, if haunted by persistent-enough ghosts.








The Man Who Invented Christmas (Bleeker Street Media/Elevation Pictures)



Matthew Robert Anderson, Affiliate Professor, Theological Studies, Loyola College for Diversity & Sustainability, Concordia University

This article was originally published on The Conversation. Read the original article.




Wednesday, December 13, 2017

From the Archives: 'School Training for Civil Defense' (1981)

Some background may shed light on this 1981 article, retrieved from my paper archives.  Fortunately the story behind it has already been told, in a remembrance of celebrated debate coach James J. Unger, which I posted in April 2008:

The high school debate topic the previous year (1981-82) was "Resolved: That the federal government should establish minimum educational standards for elementary and secondary schools in the United States." I came up with the idea, based upon research I was doing in the real world -- if the world of Washington think tanks can be described as "real" -- that we should write a case about civil defense education in elementary and secondary schools.

The problem with this idea was that there was little, if any, information available about civil defense education. (There was some material from the 1960s, but nothing recent and little that was usable by debaters.) But I was convinced this could be a winning case.

So I asked Professor Unger, "What do you do when something is topical but so obscure that there is nothing written about it that you can use as evidence for inherency?" He replied that there was not much to do in that situation, other than to intensify your research and find the evidence you need.

My solution: since I had already had one article published on the topic of civil defense -- appearing in the Washington Star on October 10, 1980, months after I submitted it and based on research I did during the summer 1980 forensics institute -- and had subsequently become an officer in the American Civil Defense Association, I could just write another one, with a focus on education, that could be used as evidence to support our case.

And that's what I did. I submitted the article to several newspapers, and it was published in the New York Tribune (a sister newspaper to The Washington Times), just days before the institute tournament. We inserted the appropriate quotations into the case (not citing me by name), held others in reserve for second affirmative and rebuttals, and moved forward.

The case was relatively successful, with two of my teams making it into the elimination rounds. After the last round that one of the teams lost, they told me that my qualifications as a source had become an issue in the debate. The judge from that round added: "Your boys defended you valiantly, but they lost on other issues."

This is a long tale meant as background for something that happened a couple of years later. As it was told to me, late one night while preparing for a tournament, members of the Georgetown debate team had hit a brick wall, unable to find the evidence they needed to complete a brief they were working on. Professor Unger popped up and said, "Well, why don't we just pull a Rick Sincere?" -- meaning, why not write an article and get it published in a reputable newspaper or journal? I don't think they ever followed through on that suggestion, but just the idea that my name became associated with a new debate tactic was enough to warm my ego.

This article was published in The News World, a New York City daily newspaper (later called the New York City Tribune), on July 28, 1981:

Richard Sincere
School Training for Civil Defense

Perhaps no aspect of the strategic competition between the United States and the Soviet Union is ignored more than civil defense and emergency preparedness. Americans waste too much effort in debates which obfuscate strategic issues by statistical manipulation of throw-weights, megatonnages, and MIRV capabilities. Public and policymakers alike are blind to the reality of the strategic balance: Deterrence of nuclear war depends as much on the willingness and ability to survive such a conflict as it does on the technical capacity to fight the battle.

The News World, "School Training for Civil Defense," 1981.
Soviet political and military policies do not reflect a frightened belief in the universal destruction of nuclear war. Instead, they maintain that nuclear weapons are instruments for war-fighting. In many ways, Soviet leaders view nuclear weapons as extensions of conventional war-fighting techniques; Soviet military literature categorizes war by who does the fighting, not by the weapons which they use. Most importantly, Soviet military strategy is fundamentally a survival-oriented strategy.

One result of this thinking has been the establishment of a nationwide civil defense network. The chief of Soviet civil defense is an army general, filling an office equivalent to our own secretary of the Army. The Soviets treat civil defense as a co-equal branch of the military. On the other hand, in the United States responsibility for civil defense lies buried in an obscure bureau of the Department of Commerce called the Federal Emergency Management Agency. Which country takes its self-protection more seriously?

In accord with the principle of protecting their people from the ravages of nuclear war, the Soviets have launched an extensive training program in all public schools from elementary to university levels, and as continuing education in industrial plants and communities. Towns and villages celebrate “civil defense days” as holidays, with sports competitions and games geared toward teaching the citizens survival techniques. And if some Soviet citizens scoff at these methods, they will at least have some skills to draw on in an emergency.

Soviet Civil Defense
A widely-circulated Soviet civil defense manual states: “Civil defense training in the public schools occupies an important place in preparing the people of our country for protection against weapons of mass destruction.” In contrast, the editor of the Journal of Civil Defense told me recently that “civil defense education has been badly neglected in the United States in the past few years. With no initiative from the higher levels, it apparently has fallen off to almost zero.”

This attitude seems unlikely to change. The shame of this neglect is that civil defense survival methods are so easy to teach. Generally, Soviet schools spend no more than 15-20 minutes each week on it, mostly in conjunction with sportsmanlike competition. One civil defense game involves nearly 20 million children each summer. The final match of this game, called “Summer Lightning,” is played in Leningrad as an object of intense national interest.

In the United States, inaccessibility to civil defense literature is the greatest obstacle to survival training. A good beginning for civil defense instruction in America’s public schools would be for the Department of Education to sponsor distribution of survival handbooks (such as Dr. Cresson Kearny’s “Nuclear War Survival Skills,” published in 1979) to all school libraries. Such a minimum requirement would allow individual school districts to expand civil defense education as much as they like, especially if assistance from the Department of Defense and FEMA were available.

Civil defense education will immeasurably increase the maintenance of a peaceful deterrent to nuclear war. As long as no civil defense training is available to United States citizens, our country remains a willing hostage to Soviet weapons with little hope of survival or recovery. Survival plays a major role in Soviet strategy and plays almost no role in our own. To neglect such a vital aspect of the strategic nuclear balance is to assure our own destruction.

Richard Sincere is research assistant for church and society at the Ethics and Public Policy Center in Washington, D.C. A member of the American Civil Defense Association, he also holds a degree in international affairs from Georgetown University.

Subsequent to this and other newspaper articles on civil defense, I testified on the topic before a subcommittee of the House Armed Services Committee, discussed it on many television and radio shows, and published a journal article that was reprinted in pamphlet form by the Ethics and Public Policy Center, which included a foreword by actor Lorne Greene. It was a central focus of my professional life in the 1980s but faded into the background after I finished my master's degree at the LSE and the Cold War came to an end. Civil defense and nuclear weapons policy took a back seat to Africa policy.

As an added bonus, here is a 1950s-era government training (propaganda?) video about school-based civil defense education.


Despite the jokes about it, civil defense in the schools was much more than "duck and cover."





Tuesday, December 12, 2017

Guest Post: Hanukkah's true meaning is about Jewish survival


Alan Avery-Peck, College of the Holy Cross

Beginning on the evening of Dec. 12, Jews will celebrate the eight-day festival of Hanukkah, perhaps the best-known and certainly the most visible Jewish holiday.

While critics sometimes identify Christmas as promoting the prevalence in America today of what one might refer to as Hanukkah kitsch, this assessment misses the social and theological significance of Hanukkah within Judaism itself.

Let’s consider the origin and development of Hanukkah over more than 2,000 years.

Early history


Though it is 2,200 years old, Hanukkah is one of Judaism’s newest holidays, an annual Jewish celebration that does not even appear in the Hebrew Bible.

The historical event that is the basis for Hanukkah is told, rather, in the post-biblical Books of the Maccabees, which appear in the Catholic biblical canon but are not even considered part of the Bible by Jews and most Protestant denominations.





The Maccabees receive their father’s blessing.
The Story of the Bible from Genesis to Revelation via Wikimedia Commons.




Based on the Greco-Roman model of celebrating a military triumph, Hanukkah was instituted in 164 B.C. to celebrate the victory of the Maccabees, a ragtag army of Jews, against the much more powerful army of King Antiochus IV of Syria.

In 168 B.C., Antiochus outlawed Jewish practice and forced Jews to adopt pagan rituals and assimilate into Greek culture.

The Maccabees revolted against this persecution. They captured Jerusalem from Antiochus’s control, removed from the Jerusalem Temple symbols of pagan worship that Antiochus had introduced and restarted the sacrificial worship, ordained by God in the Hebrew Bible, that Antiochus had violated.

Hanukkah, meaning “dedication,” marked this military victory with a celebration that lasted eight days and was modeled on the festival of Tabernacles (Sukkot) that had been banned by Antiochus.

How Hanukkah evolved


The military triumph, however, was short-lived. The Maccabees’ descendants – the Hasmonean dynasty – routinely violated their own Jewish law and tradition.

Even more significantly, the following centuries witnessed the devastation that would be caused when Jews tried again to accomplish what the Maccabees had done. By now, Rome controlled the land of Israel. In A.D. 68-70 and again in A.D. 133-135, the Jews mounted passionate revolts to rid their land of this foreign and oppressing power.





The destruction of the Temple of Jerusalem.
Francesco Hayez, via Wikimedia Commons




The first of these revolts ended in the destruction of the Second Jerusalem Temple, the preeminent center of Jewish worship, which had stood for 600 years. As a result of the second revolt, the Jewish homeland was devastated and countless Jews were put to death.

War no longer seemed an effective solution to the Jews’ tribulations on the stage of history.

In response, a new ideology deemphasized the idea that Jews should or could change their destiny through military action. What was required, rabbis asserted, was not battle but perfect observance of God’s moral and ritual law. This would lead to God’s intervention in history to restore the Jewish people’s control over their own land and destiny.

In this context, rabbis rethought Hanukkah’s origins as the celebration of a military victory. Instead, they said, Hanukkah should be seen as commemorating a miracle that occurred during the Maccabees’ rededication of the temple: The story now told was how a jar of temple oil sufficient for only one day had sustained the temple’s eternal lamp for a full eight days, until additional ritually appropriate oil could be produced.

The earliest version of this story appears in the Talmud, in a document completed in the sixth century A.D. From that period on, rather than directly commemorating the Maccabees’ victory, Hanukkah celebrated God’s miracle.

This is symbolized by the kindling of an eight-branched candelabra (“Menorah” or “Hanukkiah”), with one candle lit on the holiday’s first night and an additional candle added each night until, on the final night of the festival, all eight branches are lit. The ninth candle in the Hanukkiah is used to light the others.

Throughout the medieval period, however, Hanukkah remained a minor Jewish festival.

What Hanukkah means today


How then to understand what happened to Hanukkah in the past hundred years, during which it has achieved prominence in Jewish life, both in America and around the world?





Hanukkah today responds to Jews’ desire to see their history as consequential.
Pixabay.com/en, CC BY




The point is that even as the holiday’s prior iterations reflected the distinctive needs of successive ages, so Jews today have reinterpreted Hanukkah in light of contemporary circumstances – a point that is detailed in religion scholar Dianne Ashton’s book, “Hanukkah in America.”

Ashton demonstrates that while Hanukkah has evolved in tandem with the extravagance of the American Christmas season, there is much more to this story.

Hanukkah today responds to Jews’ desire to see their history as consequential, as reflecting the value of religious freedom that Jews share with all other Americans. Hanukkah, with its bright decorations, songs, and family- and community-focused celebrations, also fulfills American Jews’ need to reengage disaffected Jews and to keep Jewish children excited about Judaism.

Poignantly, telling a story of persecution and then redemption, Hanukkah today provides a historical paradigm that can help modern Jews think about the Holocaust and the emergence of Zionism.

In short, Hanukkah is as powerful a commemoration as it is today because it responds to a host of factors pertinent to contemporary Jewish history and life.

Over two millennia, Hanukkah has evolved to narrate the story of the Maccabees in ways that meet the distinctive needs of successive generations of Jews. Each generation tells the story as it needs to hear it, in response to the eternal values of Judaism but also as is appropriate to each period’s distinctive cultural forces, ideologies and experiences.

Alan Avery-Peck, Kraft-Hiatt Professor in Judaic Studies, College of the Holy Cross

This article was originally published on The Conversation. Read the original article.

Monday, December 11, 2017

From the Archives - Dick Armey on the U.S. Congress: 'the most dangerous gang of economic illiterates I've ever seen'

Dick Armey on the U.S. Congress: 'the most dangerous gang of economic illiterates I've ever seen'
September 15, 2010 2:52 AM MST

According to its co-author, former Texas Congressman and House Majority Leader Dick Armey, the new book, Give Us Liberty: A Tea Party Manifesto, came about in response to mean-spirited attacks on Tea Party participants.

“We were sitting around looking at these horrible, mean ways in which these good folks were being characterized,” Armey related, “and we said, ‘Somebody needs to tell the whole story, the true story.’”

Tea Party’s ‘true story’
That “true story,” Armey explained to the Charlottesville Libertarian Examiner, was that, even before the movement had a name, Tea Party activists from around the country sought advice from FreedomWorks, the advocacy group that Armey chairs.

“Almost without exception,” he said, “wherever you look in the country -- California, Florida, wherever -- where somebody wanted to put a group together and start getting the ball rolling, they called us.”

Armey answered questions about his book, the Tea Party, and the 2010 and 2012 elections in an interview on the eve of the second 9/12 Taxpayer March on Washington, which this year attracted a crowd of 100,000 or more protesters who gathered on the West Front of the U.S. Capitol to hear a range of speakers from Colombian immigrant Tito Munoz to former New Mexico Governor Gary Johnson to Virginia Attorney General Ken Cuccinelli.

Understanding Economics

The average Tea Party member, Armey agreed, has a better grasp of economics than the average Member of Congress.

“No doubt about it,” he said. “That’s one of the things that really distresses me.”

The country is in trouble, he added, if Congress Members' “understanding of economics, how the economy works, the world of commerce, where the money comes from, is less than” that of the typical citizen.

“This is a serious problem and I have no doubt about it,” Armey said with emphasis.

“You take a look at the leadership in the House and the Senate and the Executive Branch of government, starting with the President, it is the most dangerous gang of economic illiterates I’ve ever seen in my life.”

No ‘honest curiosity’
Having served in Congress for nearly two decades, Armey had observed the capacity of its Members to understand basic economic concepts.

“It is frightening,” he said. “They don’t have an honest curiosity about economics. I just don’t believe any one of them ever looked at this and asked the question, where’s this money coming from?”

Armey described legislators as “a bunch of kids that found the money tree” who say, “We can just spend all we want.”


Publisher's note: This article was originally published on Examiner.com on September 15, 2010. The Examiner.com publishing platform was discontinued July 1, 2016, and its web site went dark on or about July 10, 2016.  I am republishing this piece in an effort to preserve it and all my other contributions to Examiner.com since April 6, 2010. It is reposted here without most of the internal links that were in the original.