‘The Death of the Author,’ – Barthes – A Discussion

In his piece, ‘The Death of the Author,’ Roland Barthes criticises analyses of texts which seek to ‘decipher’ meaning through knowledge of their respective authors. To support his claim, he explores the birth of the figure known as the ‘author’ in literature, its significance, those writers that have attempted to remove themselves from their works, and what a text is without its creator.

Barthes finds the restrictive presence of the author in literary analyses to be a relatively modern phenomenon – ‘…produced, no doubt…at the end of the Middle Ages, with English empiricism, French rationalism, and the personal faith of the Reformation...’ Exploring earlier modes of storytelling, spanning cultures and eras, he applauds tales told through ‘mediators’ of a kind, ‘shamans’ without personal connection to or overt influence upon the narrative, praised at times for the quality of their delivery, but not for any part in its creation. Reading this, I was struck (perhaps slightly ironically) by the role of prophets and religious figures who record the wills of their Gods. The prophet’s position on the blurred line between impartial ‘mediator’ and religious figure or actor in their own right draws me, purely for its potential, at the root of societies and not just theocracies, to change everything. The level of awareness of the author with which one reads a religious text varies, but such texts can naturally be read analytically. In a similar fashion, oral tradition often produces profound shifts in a narrative over time, and I am inclined to believe the reasons for this are not always incidental, but rather the product of bias, suspicion, and personal motivations. All in all, although the ‘birth of the author’ – a reader’s vested interest in and awareness of that figure – can be seen, as Barthes asserts, to have taken place in the late Middle Ages, the power of the author to change narrative meaning, and awareness of that power, came into being at a much earlier date.

Barthes recounts notorious examples of artists from whose personal lives we fail to abstract their work – Baudelaire, Van Gogh – but he holds these up merely as examples of the inevitable end for art interpreted in this manner. He expounds, however, upon the potential for analysis of the art of those who have consciously tried to escape such treatment. He mentions Mallarmé first, whose methods ‘…restore the status of the reader…’ Limitations concerning time are put in place, Barthes states, when one includes the author as so strong a presence in the work; works are given a new and unnecessary expiry date, tied to their author, ‘(their) past…father…with his child.’ But writers who attempt, in a similar vein to Mallarmé, to give sovereignty to the reader, in doing so make the messages of their texts accessible, and fluid, over decades and even centuries.

This fluidity of interpretation is then explored further in the next paragraph, in what upon first reading seemed to me to be a counter-argument of Barthes’ own: ‘…the text is a tissue of citations, resulting from the thousand sources of culture.’ This suggests, surely, that a richness of understanding can result from interest in the creator, their circumstances and time, and all that influenced them (I was thinking of Nietzsche, and the social upheaval of the foundation of the German Empire during his lifetime). The idea does appeal to me – the depth of the unconscious exposition a writer, according to such a theory, would be giving of the world in its entirety up until the point of writing, all things being in some way shaped by what has come before them. However, Barthes sees, once more, limitations in this nevertheless romantic theory, and argues almost for an extension of it: the artwork taking on new meanings ‘succeeding the author,’ the ‘signs’ contained within it growing ‘infinitely remote.’ In short, Barthes recognises the significance of words beyond ‘the vulgar patience of purely literary themes,’ and validates changing significance too, empowering the reader once more. This reader is, as Barthes draws to his conclusion, the only person who can view the collected ‘multiplicity’ of the work in front of them, ‘classical criticisms’ aside, and grasp its particular converging aspects as a whole.

All in all, having read ‘The Death of the Author,’ I would have to say that Barthes channels his argument well, beginning in a slightly combative tone and ending with a fine unpicking of the point at which the two opposing views of literature meet, before their paths diverge again, forever. One of these paths – that centred upon the author – leads to a dead end; it is only in the validation of new meanings for texts that the path Barthes advocates can be sustained. The reason I can accept his argument at all, however, is this late appreciation of the mutual meeting point of the two, this channelled argument. And if I’m wrong about all this, Barthes’s own theory would oblige him to uphold my reading regardless. That would be interesting to see.

Forgotten Deities, and the Lifespan of Religious Knowledge Systems

The religion of Ashurism was founded sometime around the 18th century BC, and lasted until Assyria was destroyed, when the Medes and the Babylonians rebelled against it. It is likely that survivors worshipped secretly for a while afterwards, but they publicly converted, and over generations the religion was lost. Ashurism itself is incredibly similar to the Babylonian religion, in that it was polytheistic and centred around almost all of the same Gods – the key difference being which they placed most value upon; in Ashurism, it is naturally our star, Ashur.

Ashur was worshipped in Northern Mesopotamia (north-west Iraq, north-east Syria, and a little of Turkey). He was a deified form of the city of Assur, which is evident from around the third millennium BC, and he later gained the wife of Enlil, a God who held the same position as Ashur in an alternative strand of Mesopotamian religion. In the Assyrian creation story, it is Ashur who slays the chaos monster, Tiamat, and establishes the world. He was invoked further in the city as a warlike, vigorous presence, granting success in battle. Otherwise, Britannica suggests that he has little character, merely serving as a personification of Assur’s success in warfare.

This, logically, is the purpose of belief in Ashur – it served the conviction that the city’s successes were fated and assured, different from the fleeting conquests of other peoples and settlements. People’s faith in their leader’s battlefield prowess was secured, and the population was unified by this. Unfortunately, it only served this purpose until the fall of Assur, after which the aforementioned gradual conversion occurred. Upon investigation of the obligatory question, “Does anybody believe in this today?”, the answer from all mildly reputable sources (the Assyrian International News Agency and Britannica) was “no.” There was one man on Reddit, but the 21st-century believer’s system did not seem very similar to the ancient one. With that cleared up, I can assure you that Ashurism is no more.

Does this principle always hold true – that, with the purpose of a given religion removed, it collapses and faith wanes? Have the religions still with us today negotiated this rule in such a way as to survive this far? And, this being my belief, will there come a time when they can no longer negotiate this ‘rule of purpose,’ left with no Earthly assertion of success or non-linear criteria?

An example for analysis: Christianity. Whilst the determining factor justifying belief in the Ashurist gods was their delivery of help of some kind if you did the right thing, it is less clear-cut in Christianity. Eventually the Ashurists met enemies (the Medes and the Babylonians) who defeated them, no matter their skill in battle – a skill perceived to have been aided by the gods, and likely heightened by the previous belief that those gods were infallible. Repeated failure, despite following religious instruction, eventually proved to all but the most faithful that the Gods had no hand in the events that had befallen them.

The principle I am comparing in Christianity is the belief that righteous behaviour will allow a person to go to heaven, and that those actions certainly do have consequence – they are noted by a God. There is a burden of proof here, as there was for Ashurism, but its delivery falls crucially after death, and, with benefits for many in their life on Earth, belief falls into place regardless. Christianity has thus successfully negotiated a pitfall in religious logic and planning which Ashurism did not, perhaps due to true conviction. It is my belief that, without our being able to definitively disprove Christianity or offer a truth ourselves, it will continue indefinitely as a world religion, in part also due to the benefits believers are afforded in their daily lives. Ashurism’s role as organiser and galvanising power in society was defunct, having been categorically disproved. People received no benefit from believing that Ashur would grant their army success when his city had been conquered and they were left, critically, without an army. The religion disappeared as that hope faded, and a new religion, which offered structure and values, continues to this day on that basis.


History: What we can infer about the past, from what has made it to the present.

I am a young man, a little over twenty, living in China during the Han Dynasty (specifically, the year 180 AD). Formerly a peasant, having gotten myself into debt I have been compelled to sell myself into slavery. It is possible to buy oneself out of slavery, but very unlikely in my case, and so I do the domestic work in the home of an occultist.
This occultist (an alchemist, to be exact) devotes his time to refining techniques aimed at preserving health and life, and has recently found government employment and support in this, amongst other occultists. Although those who devote their lives to the practice of the occult arts are forbidden from holding government office, his choice to do so reflects his conviction, and that is respected – though he is held on perhaps the same level as a butcher. With this in mind, it is not as comfortable a life as the slave of a higher-status citizen (i.e. the possession of a Chieftain of the Multitude Riding a Four Horse Chariot, high up in the twenty socioeconomic ranks) might enjoy, allowed good foods and wines, but it is not the worst. My labour is not exhausting, and I am given a little protection under law. I visit the government-controlled market (taking care to avoid upsetting the slaves of higher officials, who have some influence, especially with people wishing to gain an official’s attention), aid in preparation for hamlet feasts if necessary, and clean. Life is okay.

—————————————-

Now how do I, Martha Sharp, in 2019, claim, as an historian, to know all of that? History in this context is, to be clear, the study of what has happened in the past, not the events themselves. We can seek to understand the succession of events from secondary sources, and when many are in agreement, we can feel sure of it. Surviving records can show us something of how people thought and felt about the changes taking place, as can oral accounts handed down, though the latter are far more susceptible to people’s wills, changing to suit the views of the eras they pass through, often irreversibly.

So, can I ever state definitively that I know how the life of my Han dynasty slave alter ego was lived? How he felt about the feudal system, within the hamlet, and the external forces controlling it? Did he really believe that it was unfair that the ‘Chieftain of the Multitude Riding on a Four Horse Chariot’ was only so privileged as he had bought his rank from the government, as they attempted to raise funds?

Nietzsche would argue that there can only be perspective and interpretation, both influenced heavily by a person’s motivations and opinions, political standing, social goals, and so on. My slave friend almost certainly did not represent his society with an impartial voice and clear judgement, documenting rigidly and examining himself. People simply struggle to, even when the document or tale they are composing is, by their own description, factual.

But surely these influences are sometimes negligible, not detracting from a ‘truth’ – such as the development of more sophisticated irrigation in the late Zhou dynasty, and its significance in the growth of settled civilisations? To take the middle ground, perhaps an historian must simply take the influences of perspective and interpretation into account, and assess whether they impact the source’s quality to such a degree that it is no longer effective in telling us of the qualities of the era studied. I might brag about my hamlet’s super cool canal, but it still exists – perhaps it is just a little shallower, and a bit more boring, than I make out.

The shared information in two sources of opposing opinion can tell us much too, and the perspectives and interpretations themselves are still valid points of historical investigation, particularly in the grounds of social and political changes.

The way I see it, the negation of these factors is a large part of history itself, and can contribute hugely in some areas of the study of past events and conditions. Understanding the strengths and limitations of sources is key, as we only have what survives to the present day, but from the intersection of many a perspective and interpretation we can reach what I tentatively call knowledge.

Does our education system fail at encouraging creative thinking?

My immediate response, given that I am writing this only as procrastination while I prepare to write two essays of considerable length on Richard I? Yes.

However, I don’t blame the education system for the lack of creative thinking in my IB course – I was given a choice, and I, mortally afraid of creativity (where the bounds of right and wrong are blurred and one cannot perhaps ever rejoice in being entirely, deliciously CORRECT), chose the least creative options available after the sciences. Although I take many essay subjects, it sometimes feels to me merely like harvesting other people’s ideas and, with proper credit given, re-organising them. I’m learning, and am truly inspired by what I am learning – awed, in fact – yet in truth my brain has coughed up nothing original since Year 7 or 8, and I still offer my condolences to my English teacher at the time for endeavouring to mark that. My education has instilled in me a fear of being incorrect, and in reaction, I myself have failed to think creatively.

There are, indeed, the renegades. Those anomalous, freakish children in British classrooms nationwide; if you are looking for them, teachers, they are the ones who have their textbooks closed when you ask questions. They’ll be wrong a lot more than their classmates, certainly, because they are not lifting the words from the page (no child uses the word ‘incongruous’ so naturally), and this may dismay some, but the explanations you may choose to give in response – preferable to a simple ‘No’ and a look to one of the textbook devils – are incredibly important to the development of understanding. The use of imagination even in the context of a Chemistry lesson is important, and should be protected – the textbook will get you an A* at GCSE, but never a cure for cancer.

Our attitudes in the classroom need to change, in order to help our children recover from the fear and stigma surrounding the word ‘No.’
In short, our education system does have room for imagination – but fails to invite it in.

Let’s take the time. Perhaps it’s not those children who have the attitude problem.

The Convenience of the Supernatural


As a child, I was utterly taken by the idea of the supernatural – it accounted for a lot of things that, at an older age, I now know to be quite simply explained by the fact that my mother did not know the answer and lied. The creaks on the stairs, space flight…you name it. There was a lot of stuff I believed ghosts were up to when I was six.
Now that I know better, there is still one experience that has stuck with me: a school trip when I was about nine or ten, to Preston Manor in Brighton – a time-waste under the guise of a history trip. I know nothing of the history of the place, demonstrative of the less-than-academic nature of the visit, but one thing I do remember is that the place is creepy.


Now what I was led to believe – pushed, in the spirit of the day (no pun intended), by my teachers – was that ‘invisible agents,’ i.e. ghosts, invisible beings that act according to desires and agendas very much like our own, and thus are incredibly relatable and believable, were present and active in the Manor. The poor ghosts were blamed for many an event that did not best please the Manor staff, most memorably the drafts, and the lights turning off at random. Steven Pinker states succinctly in ‘The Evolutionary Psychology of Religion’ (2004) that in all human cultures people believe that illness and calamity ‘are caused and alleviated by a variety of invisible person-like entities: spirits, ghosts, saints, evils, demons, cherubim or Jesus, devils and Gods.’ Preston Manor’s troubles don’t quite amount to calamity, per se, but the various owners’ searches through the decades for an explanation for their household accidents and malfunctions do seem to be in keeping with Pinker’s statement.

Belief in the supernatural is prevalent throughout the eras, I believe, in the Manor or otherwise, because uncertainty and fear are too. At some points in history, a ghost leaping around inside a dress cupboard was just as likely as, if not more likely than, the wood swelling and contracting with the heat. With little evidence for the latter (and the false backing of previous testimonies for the former), it was almost a logical conclusion. It certainly was to me, at ten. Such belief survives to the present day thanks to human doubt, and to its use for entertainment purposes – it terrified me at ten, and earnt the Manor a bit of money.

To conclude, I think these beliefs are generally harmless – so long as they don’t distract us in our search for a scientific truth.

It begins-

Although I have very little, even I was forced to question my knowledge after my first TOK lesson; it’s this aspect of the course that interests me most – a chance to question what we cram.
After all, how do we claim to ‘know’ any of this at all? Who has arrived at these conclusions – and how? Most importantly, am I going to allow others to inform and instruct me forever?
Perhaps I’ve been too passive.

Fearing this, I turned up to our class discussions looking for a solution. I was relieved to find that our first focus was language, and how it assists our claims to knowledge. I’m interested in linguistics, and I hoped that this might be an understandable introduction. Almost.
At least, I gained an understanding of the limitations of some key ideas that language can offer us as we seek certain knowledge – the fallacies of definition, denotation and image theory – and I started to realise how we should ask intelligent questions of ourselves and of our everyday assumptions. We ranked eight ‘ways of knowing’ according to our faith in them – and noticed surprising variations, some of which people felt very strongly about indeed. These subtle differences in personal experience, and the emphasis some classmates placed upon certain areas as opposed to others, can certainly influence how we come to a collective conclusion upon whether we ‘know’ something to be true. It’ll be important to bear in mind as we move forward in discussion, and also in my personal approach to reaching a shared understanding with people: our criteria for certainty will almost certainly differ.

The takeaway? I’m looking forward to TOK as I move forward with the IB, and I suspect that this new way of evaluating previously mundane statements (and reaching an ever-complicated mutual understanding) will prove important, but stressful.