Webster’s Twentieth Century Unabridged Dictionary (second edition) defines the following terms as follows:
Fact, n. [L. factum ] 1. anything done; an act; a deed
2. a thing that has happened or is true; a thing that has been or is
3. reality; truth; the state of things as they are
4. something declared to have happened, or have existed; the assertion of something existing or done
5. in law, something that has taken place, either actually or by supposition, distinguished from a purely legal result.
True, a. [AS. trywe] 1. faithful; loyal; constant
2. reliable; certain
3. in accordance with fact; that agrees with reality; not false.
Reality, n. [Fr. realite] 1. the quality or state of being real
2. a person or thing that is real; a fact
3. the quality of being true to life; fidelity to nature
4. in philosophy, that which is real
These words are used routinely in people’s daily lives and are almost never contemplated for very long. The concepts of reality, truth, and fact - when coupled with objectivity, theorization, and experimentation - form the basis of scientific thought. From the ancient Greeks to the present, great minds have pondered the fundamental questions of "What is real?" and "What is true?" Only in the past century has the line differentiating philosophy and the physical world begun to blur. The purpose of this essay is to explore the history of philosophy and science, and relate them to the study of epistemology in particular. Once that framework has been established, the notion of reality in a legal setting, i.e. a court of criminal law, will be explored. The above definitions will be referred to often over the course of the essay. They are the cornerstones of both scientific discovery and our legal system. More directly, they are the basis for epistemology, the branch of philosophy that investigates the origin, nature, methods, and limits of knowledge.
There are countless epistemological theories, and the subject has been pondered as far back as the Greeks, and probably farther. For the sake of brevity, this essay will focus on a select group of theories, beginning with the Platonic, progressing to rationalism, then to empiricism and to Kant’s transcendental idealism. A short survey of modern physics will follow, primarily to demonstrate where the respective studies of philosophy and physics are beginning to intersect. Following this, the essay will discuss the notion of what can be known in a legal setting, specifically a court of law. Various types of evidence will be compared to see which are the most valid.
Socrates and Plato (c. 400 BC) are the most recognizable of all the Greek philosophers. They were contemporaries, as Socrates was Plato’s teacher. While they are pillars in the study of philosophy, this essay will focus mainly on Plato’s contribution to the study of knowledge. This is partly due to the fact that all we know of Socrates is gleaned from what others (mainly Plato) attribute to him. Socrates established the method of the dialectic, a verbal form of logical argumentation. As will be shown later, Plato and Socrates employed the method of dialectic to try to better understand what can be known.
Plato accepted the constraint on any theory of knowledge that both knowledge and its objects must be unchanging (as posited by Parmenides). One consequence of this, as Plato pointed out in Theaetetus, is that knowledge cannot have physical reality as its object. In particular, since sensation and perception demand physical sensations as their objects, knowledge cannot be the same as sensation or perception. That is to say, reality can be perceived, but reality itself extends beyond our ability to perceive it.
The negative thesis of Plato’s epistemology consists, then, in the denial that sense experience can be a source of knowledge on the grounds that the objects apprehended through the senses are subject to change. To the extent that humans have knowledge, they attain it by transcending the information provided by the senses in order to discover unchanging objects. But this can be done only by the exercise of reason, and in particular by the application of the dialectical method of inquiry inherited from Socrates.
The Platonic theory of knowledge is thus divided into two parts. The first is to discover whether any unchanging objects exist and to identify them as such; the second is to illustrate how they can be recognized and understood via reason. Plato used a number of literary devices to illustrate his epistemology. The most notable is the Allegory of the Cave in Book VII of The Republic [2].
The allegory depicts people as living in a cave. The cave represents the world of perceivable experience. In the cave people see unreal objects, or images, e.g. shadows. If one endures a process of overcoming the sensible world, Plato argues, he begins to come out of the cave and into reality. The process he mentions is the analog of the dialectical method, which allows one to recognize unchanging objects, and thus gain knowledge. The upward process described in the allegory culminates in one emerging from the cave and seeing the sun. The sun represents the source of knowledge.
Plato points out that each sense is distinct and complementary. One can hear sounds, but not colors; the sense of smell captures odors, but cannot hear a bird’s song, and so on. Since knowing is a mental act, there must be some objects it can comprehend. And according to his Parmenidean constraints, these objects must be universal and unchanging.
To illustrate, he describes various triangles; some painted, some etched in dirt, and some etched in stone. They all differ in their style, but they all share the characteristic of triangularity. He refers to objects, such as a triangle painted on a rock or one etched in dirt, as particulars. They always exist somewhere in space and time, in the world of appearance. But the common property they share, their triangularity, is what Plato called a form or idea.
Forms differ from particulars insofar as they do not exist in space-time, and more importantly, they are unchanging. Forms and ideas are the objects one must grasp in order to acquire knowledge; particular things exhibit various characteristics but are temporal, while their forms are eternal. The color white does not change, nor does the notion of a triangle. Through the dialectic process of question and answer, one gains a steady collection of real conditions, which define a concept exactly. That collection of knowledge leads to a more exact understanding of reality.
That is to say, for Plato, defining forms is in itself a way of describing what can be known, and thus reality. But that raises the question of how knowledge should be defined. Just because one knows something does not make it true. Plato recognized this dilemma and asserted in Theaetetus [2] that knowledge is in itself a belief, and that some beliefs are false. That is, some things people believe are false. Having knowledge that is true leads to an understanding of reality. It is in Theaetetus that Plato defines knowledge as a justified true belief.
From the beginning of epistemology, there has been a discussion of what fraction of our knowledge is innate versus learned. That delineation is most prominent when one compares rationalists to empiricists. Rationalists tend to emphasize reason or understanding as distinct from sense perception. They tend to subscribe to the belief that some basis of knowledge is innate. Plato certainly falls in line with the rationalists insofar as he felt knowledge could be gained from an intellectual investigation of facts. He did not, however, directly address the notion of innate ideas. Rather, he felt forms existed, waiting for us to discover them through rational processes (i.e. dialectics).
Almost 2000 years after Plato published his notions of what knowledge was, René Descartes set out to further explain what could be known. One of his major works on the topic is the Discourse on Method (published in 1637), which is written in autobiographical style. In it he sets out four rules for his explorations. The first is "to accept nothing as true which I did not clearly recognize to be so." The second is "to divide each of the difficulties under examination into as many parts as possible." The third, "to carry on reflection in order, beginning with objects that are most simple and easiest to understand, in order to rise, little by little, or by degrees, to knowledge of the most complex." The fourth, "to be so thorough and general as to be certain of having omitted nothing [3]." If nothing else, Descartes strove to be rigorous and complete.
Apart from the analytical basis of the rules, and their general concern with arrival at certainty, there is nothing startling or revolutionary about Descartes’ method. It is in Part IV of the Discourse where he reveals his most celebrated idea: "Cogito ergo sum," which translates to the famous "I think, therefore I am." It is useful to point out here that there is a branch of philosophy called skepticism, which posits that all arguments are equally bad and that we cannot attain absolute truth. The challenge of skepticism, as Descartes saw it, is portrayed in his Meditations on First Philosophy [4], published in 1641. He considered the supposition that all of one’s beliefs are false, being delusions created by an evil genius (a demon or the devil himself) that has the power to impose those beliefs upon people without their knowledge. But Descartes claimed it is not possible for all of one’s beliefs to be false, since anyone who has false beliefs is certainly thinking, and the act of thinking implies that the person exists. And if one has the ability to conceive his own existence, then he exists, and that necessitates truthful recognition of the self.
He distinguished two sources of knowledge: the first, intuition, and the second, deduction. Intuition is the direct apprehension of something that is experienced. The truth of the proposition "I think" is guaranteed by the intuition one has of one’s own experience of thinking. If one could only know that one exists, then the sum of human knowledge would be depressingly small indeed. It was Descartes’ intent to broaden the expanse of human knowledge, and it is his methodology for doing so for which he is most renowned. After demonstrating that all human knowledge depends upon thought or reason, and not upon sensation or experience (the bread and butter of rationalism), he then proceeds to prove to his own satisfaction (rule number one above) that God exists, and that the mind is easier to know than the body.
In their original form, his proofs convinced few people. One major problem he encountered is what has become known as the Cartesian circle. In the Meditations, he attempts to escape the possibility that the Devil is deceiving him about all he believes, and so he attempts to prove that God exists. In the proof he applies his measures of clearness and distinctness. He then contends that clearness and distinctness are the criteria for all knowledge because God does not deceive man [3].
The logic is circular and therefore flawed. On one hand, he claims that his criteria of clearness and distinctness are sufficient to establish an idea’s validity. He then notes that God must exist because the idea of God is clear and distinct. He later asserts that clearness and distinctness are valid criteria since God would not deceive man. Effectively, his measure for establishing God’s existence is a measure that is established by God.
In spite of his shortcomings, Descartes is important today for several reasons. He was a pioneer in mathematics, specifically geometry. In terms of epistemology, he is important because he propagated the notion that knowledge is rational and distinct from perceived sensations. While Descartes recognized that some knowledge is inductive or deductive, he contended that the basis of knowledge was innate. Specifically, he held that the principles upon which knowledge is created are innate - a very important supposition, indeed.
If there is one school of thought that posits that knowledge is real only if it is arrived at rationally, one would expect a school to counter that assertion. That school is Empiricism.
John Locke and Empiricism
Empiricists hold differing views on particular points of philosophy, but the central tenet of the school is that sensory experience is the source of knowledge. There are many fine works to consider; the first serious empiricist works were by John Locke, George Berkeley, and David Hume. All three hailed from Great Britain; for that reason they are often referred to as the British Empiricists. The timeframe of their respective works ranges from the mid-1600s to the mid-1700s. John Locke is a reasonable and often used reference for those studying the topic. His work is examined below.
Locke’s treatise, Essay Concerning Human Understanding [5], is considered to be the first major empiricist work. It consists of four books whose titles give the reader a good idea of the subject matter; they are: Of Innate Notions, Of Ideas, Of Words, and Of Knowledge and Opinion (from here on referred to as Books I through IV, respectively).
The first book contains a sharp attack on the notion of innate ideas. On the face of it, Locke is concerned with two things: one, whether there is any innate knowledge of principles; and two, whether what he refers to as the "materials" of knowledge, the ideas on which knowledge is based, are innate. He attacks the view that there are innate "speculative principles," e.g. identity and contradiction. He also attacks the notion that there are innate "practical principles" (e.g. principles of morals). The conventional wisdom of Locke’s time (and perhaps of today as well) was that such principles were innate, a claim based on the almost universal approval of them. That is to say, most people agreed that human beings are born with a sense of moral right and wrong, a notion of identity, and an ability to recognize contradictory assertions. Locke denies such principles are universally accepted, and points to the fact that "young children or idiots [5]" do not universally appreciate them. He continues by arguing that there can be no innate ideas either; therefore the materials of knowledge cannot be innate, which means that knowledge itself cannot be innate.
In Book II of the Essay, Locke supposes the mind to be like a blank sheet of paper waiting to be filled. This raises the question: how does the paper get filled? Said Locke, "To this I answer, in one word. Experience." [5] He divides experience into two categories: one, observations of external objects, and two, observations of the internal workings of the mind. The former is clearly another description of sensation. The latter category is not so easily summed up in one word, but Locke used "reflection" to designate it, because people arrive at ideas by reflecting on the operations of their own minds. To Locke, there is no other source for ideas. On the whole, he insists that in getting ideas of sensation the mind is passive, and he likens it to a mirror in this respect. As examples of reflections Locke offers perceiving, knowing, willing, doubting, thinking, and believing.
Like Descartes, Locke believes in the notion of simple and complex ideas, and that complex ideas are built upon a foundation of simple ones. The agreement ends there. To Locke, an idea is anything that the mind "perceives in itself, or is the immediate object of perception." [5]
Locke goes on to describe qualities as the power that objects have to cause ideas. Many words have dual senses. The word red, for example, might mean either the idea of red in the mind or the quality in a body that causes the idea of red in the mind. Some qualities are primary in the sense that all bodies have them. For example solidity, extension, figure, and mobility are primary qualities. Secondary qualities are those powers that, in respect of the primary qualities, cause the sensations of sound, color, odor, and taste. Locke’s view is that the phenomenal redness of a fire engine is not in the fire engine itself, nor is the phenomenal sweet smell of a rose in the flower itself. Rather, certain configurations of the primary qualities cause phenomena such as the appearance of red or the taste of sweetness, and in respect of these configurations the object itself is said to have the quality of redness or sweetness.
Until Locke, these ideas had never been expressed, and while his arguments are not without flaws, his work is still recognized as beneficial in terms of gaining a better understanding of the world in which we live. He was a great influence upon Berkeley and Hume, who would go on to further the empiricist view of epistemology.
Kant and Transcendental Idealism
If one were to blend the notions of rationalism and empiricism, the school of idealism would likely emerge. Simply put, an idealist believes that everything that is known is mental; that is, everything that exists is either a mind or depends upon the existence of a mind. German philosopher Immanuel Kant (1724 - 1804) was not an idealist according to this definition, although he often referred to himself as a transcendental idealist.
Kant held that the human self, or "transcendental ego," constructs knowledge out of sense impressions and from universal concepts called "categories" that it imposes upon them [6]. Kant’s transcendentalism stands in contrast to the views of his predecessors – the rationalists and the empiricists – insofar as he believed that ideas, the basis of knowledge, must somehow be due to realities existing independently of human minds; but he held that such things in themselves must remain forever unknown. Human knowledge cannot reach them because knowledge can only arise in the course of processing the ideas of sense. Humans can know only what is offered to their senses or what is contributed by their own minds. Every sensory experience is a mixture of the sensory content that is presented to a person and a concept of three-dimensional space and time, which is contributed by the mind itself. The last supposition is rather important – Kant is one of the first philosophers to consider the notion that space and time are merely constructs of the human mind, rather than absolute features of reality.
According to Kant, if one formulates a sensory experience into a judgment, then the mind necessarily contributes additional objective features. Judgment incorporates ideas of something being a substance or quality of that substance (Lockean empiricism). Judgment can even conceive causality. In short, according to Kant, sensory inputs make up a small portion of greater human knowledge. The human mind contributes most of the raw material of knowledge. Put another way, insofar as human knowledge is concerned, rather than the mind trying to accommodate itself to the external world, the world conforms to the requirements of human sensibility and rationality. Kant compared his reorientation of the way philosophers ought to study human knowledge to the Copernican revolution in astronomy (more on that later). Just as the Earth revolves around the Sun, contrary to common sense, objects conform themselves to the human mind, also contrary to common sense.
In his monumental work, the Critique of Pure Reason, Kant attempted to replace what he saw as crude empiricism with a more sophisticated approach to understanding reality. Pure empiricism left room for the notion that nothing exists, which Kant summarily rejected as "the absurd conclusion that there can be appearance without anything that appears." [6]
Kant does not deny the empirical nature of our knowledge per se; rather he attempts to expose the rational pillars upon which that knowledge is built. He called his idealism "transcendental" because the conditions he was looking for transcend all experience (they are common to all experience). He believed that all objects of sensation must be experienced within the limits of space and time (which are constructs of the mind, ultimately); therefore, all physical objects have some spatial and temporal location. Because space and time are the backdrop for all sensations, he called them "pure forms of sensibility." [6] In addition to these forms, there are also "pure forms of understanding," that is, categories or a general structure of thought that the human mind contributes in order to understand physical phenomena. Thus, every empirical object is thought to have some cause, and to be either a substance or part of some substance.
Without doubt, Kant’s epistemology is the most rigorous and sophisticated of any considered herein. The Critique of Pure Reason reads like molasses in November, but is filled with some of the most advanced concepts the subject has ever seen. Even today, 200-plus years after its first publication, philosophy students pore over every detail as though they have come across the Holy Grail of knowledge.
Some of Kant’s views, in particular the notion that space and time are not absolute, but merely constructs of the mind, have been shown in a promising light over the last 100 years. One could argue that the study of physics since the time of Newton has crept ever so close to the esoteric study of epistemology. Modern physics has moved our understanding of the physical world from that which is directly observable (macroscopic) to that which is necessarily indirectly observable (microscopic).
The progression toward understanding is hardly close to complete, but physicists and cosmologists are pondering the question of how the universe came into existence, and they are approaching it from two directions. On one hand, physicists look deep into the heart of matter, in an attempt to understand its fundamental characteristics; on the other, cosmologists look deep into the expanding universe for subtle hints as to the nature of its workings. At this point it would be good to consider the relationship between science and epistemology.
Science and Epistemology
Plato defines knowledge as justified true belief, and through Socrates’ method of dialectic he provides a means of justification. Platonic epistemology gives a sort of timeless path to finding true knowledge; the difficulty, of course, is that some beliefs are harder to justify than others. For example, when one considers only the apparent motion of the Sun about the Earth and little else, one can easily deduce that the Sun revolves around the Earth. This, in fact, is exactly what was known to be true until Copernicus suggested just the opposite in 1543.
Copernicus arrived at the notion that the Sun was the center of the solar system after years of detailed observations of celestial bodies (besides the Sun) across the sky, coupled with a willingness to reject conventional knowledge. His observations could only make sense if conventional knowledge was abandoned. The Copernican revolution marks the first great victory of knowledge gained through objective observation over the conventions of ignorance and superstition.
Once people were freed of dogmatic chains and allowed to objectively question the world around them, they began to do so with increased frequency. The advent of technologies like the printing press fueled the flames of democracy, free thought, and free expression. As people began to communicate with one another more efficiently, their ideas of how the natural world operated grew more sophisticated, and more accurate.
We have invented, tested and embraced many physical theories and our lives are better for it. Physics attempts to explain physical phenomena in the most objective, meaningful and accurate way possible by means of the scientific method. In essence, Socrates’ dialectic has been replaced with multibillion-dollar particle accelerators and telescopes in orbit.
The logical end of epistemology is a complete understanding of knowledge, void of human subjectivity and independent of our existence. Kant said this knowledge is unattainable, that it is beyond the grasp of our ability even to conceive it. Yet scientists grasp for it every day, seemingly getting closer with each lunge. Are they lunging toward a mirage? Certainly the way the moon orbits the earth is well understood. Quantum mechanics has proven itself worthy of the term "scientific fact" insofar as it predicts with wonderful accuracy a whole array of physical events. But is the ultimate truth attainable? Is there an objective reality that exists without us? That is to say, if the universe were to exist, but we weren’t a part of it, would it still obey the same laws?
Ask a physicist the old question about a tree falling in the woods: if no one is there to hear it, does it make a sound? The physicist would say yes. The notion of objectivity is so thoroughly burned into their psyches that they are incapable of seeing the world in any other way. The fundamental problem with this approach is that we are attempting to place our notions of objectivity upon physical events. Just as we cannot hear sounds as a bat does, our notion of objectivity is limited by our conception of it. What is lacking here is the distinction between our objective models and objective reality. Had mankind never existed, then none of our models would have been, and neither would our version of reality; the universe would likely continue on with its absolute reality fully intact. To put it another way, what we might consider objective may not be objective to another being, from some other planet, living in some other dimension (assuming such a being exists).
The problem seems semantic in nature. All great theories, from Sir Isaac Newton’s Principia to Richard Feynman’s quantum electrodynamics, are mathematical models of how physical systems behave. By definition the models are not reality, but merely representations of observed events that have the ability to predict future events. Some models are quite good, but all fall short of completeness. These failings are not newsworthy, though, because the scientist looks long and hard for places where his model fails, and then tries to adjust the failed part while keeping the rest intact. Occasionally this is not possible, as was the case with classical physics at the quantum and relativistic levels. When it came to describing the quantum aspects of nature, classical physics crumbled, and the ultimate result was a better theory. Since the theories are admittedly abstract representations of real events, one should not mistakenly attribute any sort of absolute truthfulness to them. But what if one considers the mathematics upon which the models are based?
The fundamental feature of all physical models is the mathematics. Since the time of Euclid, mathematics and physics have shared a symbiotic relationship. Newton invented calculus to better explain the motion of objects. As mathematics became more abstract, physical theories became more sophisticated. Einstein’s general relativity would not exist if not for the non-Euclidean geometry it is based upon. Post-modern physics (e.g. the standard model, string theory) would not exist if not for the very abstract mathematical notions of group theory, Lie algebras, and topology. It seems as though mathematics is the master key to understanding what is ultimately knowable, thereby raising the questions: Is there some form of mathematics so pure as to transcend all human thought, reason, and subjectivity? If so, is it even possible for people to find it? Kant would likely answer yes to the former question, and no to the latter. Any form of "pure" mathematics that is the "true" basis for how physical events transpire cannot be conceived in the subjective minds of people – no matter how objective they pretend to be. In short, we are anchored by our perceptions, both sensory and intellectual.
In light of this seemingly depressing supposition, one could ask: what then is the point of continuing our pursuit of physical theories? The answer of course is rooted in pragmatism. Our pursuit of objective reality, although ultimately doomed to failure, will lead to a reasonably good facsimile of that reality. Bit by bit better theories will emerge; even if the new theories are not complete descriptions of objective reality they will provide such utility as to be indispensable.
If objective reality is ultimately unattainable via the scientific method, then of what quality must the reality created in a court of law be?
Epistemology and the Law
While philosophers have been contemplating the esoteric nature of knowledge for the past 2500 years, various legal systems have been adjudicating people’s behavior and transgressions as well. In 399 BC Socrates was tried in a Greek court of law for corrupting the morals of the youth, and for religious heresies. He was convicted, sentenced to death, and executed. Although he felt his conviction was unjust, he declined the offers of those wishing to help him escape. He reasoned that in spite of the unjust nature of his conviction, the rule of law must be obeyed. The trial of Socrates is a good starting point for a discussion of epistemology and the legal system.
The Trial of Socrates
The only two surviving accounts of Socrates’ trial are found in the writings of two of his students, Plato and Xenophon. The following description is gleaned from Plato’s version of the Apology [2]. (Given that both Plato and Xenophon were students of Socrates, one can reasonably question the fairness of their accounts.) What is known is that in 403 BC a general amnesty was issued to all Athenians for transgressions that took place before or during the rule of the Thirty Tyrants, so Socrates could only have been tried for crimes committed during the last four years of his life. In the latter half of the fifth century BC, Athens was a tumultuous place. There were several antidemocratic revolts, and one of the final ones took place in 401 BC. Two young students of Socrates were found to have been involved in the plot, and it seems as though Athenians had grown wary of Socrates’ influence upon the youth. The Athenian legal system allowed for charges to be brought forth by any citizen. Socrates was charged by Meletus (a poet) with "refusing to recognize the gods of the State, introducing new divinities, and corruption of the youth." [2]
The trial consisted largely of oral testimony, and Socrates spoke in his own defense (the Apology) for roughly three hours. During his oration he contended that he had battled for decades to save the souls of Athenians - pointing them in the direction of an examined, ethical life. In Plato’s version of the Apology, Socrates reportedly said to his jurors that if his teaching about the nature of virtue "corrupts the youth, I am a mischievous person." He continued, according to Plato, that he would rather be put to death than give up his soul-saving: "Men of Athens, I honor and love you; but I shall obey God rather than you, and while I have life and strength I shall never cease from the practice and teaching of philosophy." [2] If Plato’s account is accurate, the jury knew that the only way to stop Socrates from lecturing about the moral weaknesses of Athenians was to kill him.
The jury consisted of 500 men, each of whom was allowed to vote to convict or to acquit. There was no formal deliberation, and the judge did not issue the jury any instructions on how to treat the evidence, or in what manner they should attempt to determine Socrates’ guilt. When the votes were counted, the outcome was guilty by a 60-vote margin [7].
Beyond the adversarial nature and trial by jury, the details of our current legal system bear little resemblance to that of Athens. The most striking similarity is the establishment of the facts of the case. This process determines the epistemological framework for determining guilt or granting acquittal.
Epistemology and US Criminal Law
In a nutshell, here is how our criminal justice system is designed to work: A crime is committed. Police investigators try to establish what transpired - what events took place. They in turn present their findings to the State prosecutor, who analyzes them and looks for transgressions of the criminal code. Once the prosecutor determines which codes have been violated, and the police are able to capture a suspect, the prosecutor may present the charges and some set of evidence to a Grand Jury. If the Grand Jury finds both the charges and the evidence credible, it will recommend that the suspect face the charges in open court. It is in the court where the framework of facts is presented to a judge or jury. The defense attempts to dispute the facts sufficiently to cause the judge or jury to acquit. The judge or jurors consider the facts of the case and attempt to determine guilt.
Recall the fifth definition of the word fact from Webster’s: "in law, something that has taken place, either actually or by supposition, distinguished from a purely legal result." Criminal courts have a necessarily narrow channel for the admission of facts. Hundreds of years of case law and legislation have culminated in volumes of evidentiary rules and procedures – most notably the Federal and State Rules of Evidence.
Evidentiary rules are necessary, but bland. A more interesting topic of discussion is how the changes in technology coupled with a better understanding of human cognition and psychology work to affect the outcomes of trials.
One of the most damning forms of evidence is corroborated eyewitness testimony. The Findlaw.com legal dictionary defines an eyewitness as "one who sees an occurrence or object or sometimes experiences it through other senses (e.g. hearing)." Countless cases have been decided on the strength of this kind of testimony, in spite of years of studies demonstrating its unreliability. This is not meant to suggest that eyewitnesses intentionally lie to persuade a jury. On the contrary, most people’s honest recollections of events are inherently flawed. Among the more renowned experts in the study of eyewitness testimony is Dr. Elizabeth F. Loftus, a cognitive psychologist who has served as an expert witness in many high-profile cases, ranging from the Rodney King beating trial to the trial of Oliver North. After more than twenty years of study she has concluded that eyewitness testimony is flawed almost all the time8.
There are several reasons for the fallibility of eyewitness accounts. Obvious physical hindrances such as poor viewing conditions, brief exposure, and stress account for part of the problem. The other part is subtler. According to Loftus, people’s internal stereotypes and expectations play a more significant role than one might expect. This raises an important question: why are juries so content to accept these kinds of "facts" in light of such uncertainty? The answer lies in our desire to believe what we see with our own eyes. The average person is not aware of the way his or her mind works in synthesizing memories. During the synthesis, the subconscious quickly fills in any holes or gaps that may exist. It is here where personal bias enters the process. Most eyewitness testimony is what the witness believes to be true; i.e. witnesses are typically not lying on the stand. Nevertheless, the credibility of their stories should be called into question on account of human cognitive psychology, above all other reasons.
Physical evidence represents another powerful set of "facts" presented to jurors. For decades, facets of forensic science have yielded powerful results in the courtroom. With the advent of effective and inexpensive DNA tests, the nature of forensic evidence is changing - and several prior convictions are being overturned along with it. Once thought to be indisputable, fingerprints, hair samples, and other forms of forensic evidence are now being challenged by the results of DNA testing.
When James Watson and Francis Crick discovered the structure of the molecule deoxyribonucleic acid, or DNA, fifty years ago, it is unlikely they dreamed its analysis would one day be applied to criminal law. Like fingerprints, each person’s DNA is unique (identical twins excepted). Unlike fingerprint and hair-sample analysis, the methodology of DNA analysis is very precise, and the likelihood of a mismatch is infinitesimally small provided the test is properly administered. After gathering relatively small tissue, blood, or hair samples believed to belong to the perpetrator, the forensic scientist determines the genetic profile of the sample. The same analysis is done to a cell sample gathered directly from the suspect. Once the respective profiles are determined, comparing them for a match is straightforward.
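The comparison step can be illustrated with a short sketch. This is a deliberate simplification: real forensic profiling compares repeat counts at standardized genetic markers and weighs matches statistically, and the marker names and profiles below are entirely hypothetical.

```python
# Illustrative sketch only: forensic DNA matching, reduced to comparing
# hypothetical profiles (marker name -> pair of repeat counts).

def profiles_match(sample, suspect):
    """Return True if the two profiles agree at every marker tested in both."""
    shared = sample.keys() & suspect.keys()
    if not shared:
        return False  # no common markers means no basis for comparison
    return all(sample[m] == suspect[m] for m in shared)

# Hypothetical two-marker profiles
crime_scene = {"marker1": (12, 14), "marker2": (7, 9)}
suspect_a   = {"marker1": (12, 14), "marker2": (7, 9)}
suspect_b   = {"marker1": (11, 14), "marker2": (7, 9)}

print(profiles_match(crime_scene, suspect_a))  # True
print(profiles_match(crime_scene, suspect_b))  # False
```

In practice, of course, the persuasive force of a match comes not from the comparison itself but from the vanishingly small probability that two unrelated people share the same profile.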
This new technology gives prosecutors a valuable weapon in the courtroom, but defense attorneys employ its benefits equally. For example, the State of Illinois has used DNA testing as a means of reversing the convictions of more than 12 death-row inmates. The process of overturning such a conviction is daunting, but when armed with this nearly indisputable proof, a defense attorney has a legitimate chance of seeing his or her client freed. In all, the state had exonerated more death-row inmates since 1972 than it had executed. That prompted then-Governor George Ryan to suspend, and ultimately commute to life in prison, all death sentences in the State. In his opinion, the validity of every capital verdict had been called into question.
There is a common thread here that binds the philosophical study of epistemology, the pragmatic study of physics, and the practical application of evidence in a courtroom. It is the establishment of what Plato called "justified true beliefs." The way the validity of facts is measured in court is currently undergoing a revolution of sorts, in the form of DNA evidence. It is rapidly increasing in stature in the realm of courtroom facts, and may become the most respected of them all. Given the degree of certainty associated with this type of evidence, it is arguably among the most dramatic improvements the legal system has ever seen.
From this perspective one can see how the study of epistemology is applicable to the law. In court the presentation of evidence establishes a framework of the reality of the trial. The information the jury is exposed to in the form of evidence does not in itself define the reality. As Kant might point out, in the mind of an individual juror the evidence is synthesized with personal experience and comes to describe the sequence and "facts" of the case. That is to say the evidence is shaped to conform to the juror’s psyche. Since there is so much subjectivity inherent in the process, the system is innately flawed. However, in spite of its flaws the mission of the legal system is a noble one, like those of the philosopher and scientist.
1. Webster’s Twentieth Century Unabridged Dictionary, second edition.
2. The Complete Works of Plato, translated by Grube, 1997.
3. Discourse on Method, René Descartes, translated by LaFleur, 1960.
4. Routledge Philosophy Guidebook to Descartes and the Meditations, Hatfield, 2002.
5. Essay Concerning Human Understanding, John Locke, edited by Nidditch, 1990.
6. Critique of Pure Reason, Immanuel Kant, edited by Smith, 1990.
7. A History of Western Philosophy, Bertrand Russell, 1976.
8. Eyewitness Testimony, Elizabeth Loftus, 1996.
Martin Murphy has been employed as a Particle Accelerator Operator at Fermi National Accelerator Laboratory in Batavia, IL for the past seven years. He holds BS degrees in Physics and Mathematics, with a minor in Education. He is currently pursuing an MBA from North Central College in Naperville, IL, with an emphasis in Finance. Since birth he has been an avid Chicago Cubs fan and has little use for the St. Louis Cardinals.