Sudbury Valley, Alternative Education, & the Power of Allowing/Challenging/Trusting Students to take Charge of their Own Learning Process
Free at Last: The Sudbury Valley School by Daniel Greenberg (Pgs 1-11)
Mission statement from the By Laws of the Sudbury Valley School
The Purpose for which this corporation is formed is to establish and maintain a school for the education of members of the community that is founded upon the principle that learning is best fostered by self-motivation, self-regulation, and self-criticism…
What would be the opposite? An educational system in which learning is believed to be best achieved through external-motivation, external-regulation, and external-criticism? And is this not precisely what we see in the realm of compulsory schooling today? Amidst all the fashionable, noble-sounding talk about preparing the next generation for the future, we do precious little to show respect for the fact that the future is their burden to create. By orienting the learning process around external goals, external rewards, external bribes, external punishment, external rules, and external force, we teach children to react and respond to the universe as if it’s a place where external conditions are the driving forces in their lives. Then we bemoan their lack of initiative, their lack of self-determination, their lack of discipline, and so forth. As much as it threatens the political elites who wish to maintain the power to decide what’s best for everyone else, the open secret of education must be acknowledged: until the person who is being educated is allowed, trusted, respected, and challenged to take primary responsibility for every aspect of their education (i.e., what they study, why they study it, when they study, how they study, with whom they study, etc.), all of our externally directed methods of reform will only result in superficial changes in a futile game of “who gets to be the boss” for power-obsessed adults while the very people who should be at the center of education go ignored.
While I can’t speak for every aspect of the Sudbury Valley approach to education, the foundational principle quoted above is one which I believe we’d all be better off respecting. A student’s educational journey becomes successful not on the day when they get accepted into a college or at the moment when they receive a diploma or degree, but on the day when they embrace and express their power to create the results that matter most to them. The aim of learning isn’t to achieve validation from administrators, parents, or professors, but rather to make a lifelong practice of self-actualization, self-expression, and self-mastery. Regardless of our schooling or our lack thereof, we are educated only to the degree that we are empowered.
On what makes the Sudbury Valley approach distinct:
The school starts from a premise stated by Aristotle over 2,000 years ago in his famous opening to the Metaphysics: “Human beings are naturally curious.” This implies that people learn constantly, as an innate part of living. It means also that children will learn through following their natural inclinations, doing what they want with their time, all day, every day. Regardless of their ages, from the moment students enter the school, they are on their own, forced to take responsibility for themselves and make all the tough decisions that will determine the course of their lives. The school, with its staff, physical plant, equipment and library serves as a resource that is available when asked for, passive when not. The idea is simple: driven by innate curiosity, which is the essence of human nature, children will make enormous exertions to explore and master the world around them.
As far as learning and teaching were concerned, we wanted people to be able to learn only what they were eager to learn on their own initiative, what they insisted on learning, and what they were ready to work hard at. We wanted them to be entirely free to choose their own materials, and books, and teachers. We felt that the only learning that ever counts in life happens when the learners have thrown themselves into a subject on their own, without coaxing, or bribing, or pressure. And we were sure that teachers working with eager, determined, persistent students would experience unusual satisfaction. In fact, we thought that such an environment would be a paradise for students and teachers alike.
Most of the stress teachers feel is directly related to their struggles to get students to care about things that their students clearly don’t care about. Rather than trusting and honoring the innate wisdom of the student’s curiosities and interests, we make their preferences and priorities subservient to abstract goals determined by people, most of whom will never have to pay a price for the outcome of the student’s choices. Curiosity is not a distraction to learning. It’s the foundation of learning. Rather than teaching children to “focus” on things they don’t care about, we should encourage them to harness the power of their own capacity for intrigue. When we make our agendas the humble servant to the student’s curiosity, teaching becomes fun. Effective teaching has nothing to do with playing the role of the learning police. Effective teaching begins when the lesson plan hitches itself to the wagon of a child’s sense of wonder.
On the Sudbury Valley’s mission to foster a sense of responsibility among its students:
More than anything, we wanted people to experience the full meaning of responsibility. We wanted them to know what it is to be a responsible person — not just from books, or lectures, or sermons, but from everyday experience.
The way we saw it, responsibility means that you have to carry the ball for yourself. You, and you alone, must make your decisions, and you must live with them. No one should be thinking for you, and no one should be protecting you from the consequences of your actions. This, we felt, is essential if you want to be independent, self-directed, and the master of your own destiny.
The school we had in mind had to be rooted in this idea. We could not be satisfied with anything less than full personal responsibility and accountability for each person, regardless of age, or knowledge, or achievement. We knew that people would make mistakes this way — but they would know that the mistakes they made were their own, and so they would be likelier to learn from them. We felt that healthy people would always find successes. We believed it a good thing to let people try whatever they want, whether or not they were sure to succeed, so that they would be mentally prepared to meet an unexpected challenge, or seize an unexpected opportunity.
Learning responsibility is not just a matter of content, but it’s also a matter of context. Students learn how to be responsible not by being forced to do tasks out of a fear of getting into trouble, but by being trusted and challenged to do the hardest (but most exciting) work of all: thinking carefully about what they want, deciding what actions they should take to create what they want, acknowledging and accepting the opportunity costs of their choices, owning their decisions, and learning to live with the consequences of their decisions. These sorts of lessons are simply not learned when such important decisions are constantly being made for them by adults who place a higher priority on forcing students to do the “right” thing than on teaching them how to think for themselves. If students don’t play a very large role in how they learn, the real power of what they learn is lost.
On how they teach children to think about authority:
Fear of power and authority was what we wanted to abolish from the school. We were not concerned about people having authority. Authority in and of itself can be good or bad, depending on many things. Some situations need persons in authority — an apprentice learning situation, for example, or a business. The main question is how people get their authority, and how it is controlled once they get it. You are not afraid of people in a position of power if you understand why they are there, and if you can keep an eye on everything they do. What you are afraid of is arbitrary authority, authority that excludes you from participation, over which you have no control. We were determined that no person in the school, whether student or staff or parent or guest, should have any cause to fear the authority of anyone associated with the school. This more than anything would make it possible for one person to look another straight in the eye regardless of age or sex or position or knowledge or background.
The failure to have a healthy skepticism towards authority is the source of much futility in our world. Consider all the people who feel cheated, ripped off, and victimized not because they didn’t have the power to choose, but because they chose to transfer the locus of such power to authority figures who they blindly believed would function as their saviors. This is not to say that we shouldn’t feel sympathy for such people. Pity them we must. Help them we must. Better still, however, strike the problem at its root by challenging educational systems that encourage and reward such self-victimizing mentalities. People don’t need to be arbitrarily rebellious, but they do need to learn how to think critically about what authority is, why it exists, when it matters, when it doesn’t matter, what its limits are, and how to detect and defend themselves against its abuses.
On the uniqueness of the Sudbury environment and the importance of not looking like a stereotypical school setting:
The place doesn’t look or feel like a school at all. The standard “school cues” are missing. It looks more like a home, with many persons going about their varied activities in a determined, yet relaxed, manner. The furniture, the people, and the ambience are not what one might expect to find. Visitors often feel baffled: they look for what they are used to seeing in schools, and don’t encounter it there.
Insights from the Diary of Søren Kierkegaard on writing, reading, the futility of obsessively aiming to please one’s audience, and the salvific effect of doing creative work.
Kierkegaard on Popular Opinion, the Petty Jealousies of Criticism, and the Only Cure for Embitterment in Creative Work
Popova begins by sharing Kierkegaard’s expression of his steadfast determination to remain undistracted by the whimsical and unpredictable demands of the audience in his imagination. For Kierkegaard, one does not stand the chance of becoming a good philosopher if his musings are audience-driven rather than conviction-driven:
Really, an author’s lot has gradually deteriorated to be the most wretched state of all. An author ordinarily must present himself … hat in hand, bowing and cringing, recommending himself with fine letters of introduction. How stupid: one who writes must understand that about which he writes better than he who reads; otherwise he would not write.
Or one must manage to become a shrewd little pocket-lawyer proficient at gulling the public. — That I will not do, no I won’t; no I won’t — no, the Devil take the whole caboodle. I write the way I want to, and that’s the way it’s going to be; the rest can do what they like, they can stop buying, stop reading, stop reviewing, etc.
The desire and demand for attention in the contemporary blogosphere can easily seduce the aspiring writer into believing that their craft is meaningless if they’re not successfully marketing their content to large audiences. Different writers have different philosophies on this subject, but my contention is that writing is most rewarding when it is approached as a spiritual practice above all else. I see Ernest Hemingway’s advice to “Write the truest sentence that you know” as the means and end of writing. While writing is a powerful tool for selling products or influencing people, it’s most effective and fulfilling when the writer is writing things that he or she truly believes, deeply feels, and actually lives. Whenever one’s concern about audiences gets in the way of self-authenticity and self-actualization, nothing worth having follows.
One other noteworthy passage Popova shares gives us a glimpse of Kierkegaard’s philosophy regarding the selection of reading materials:
Everyone today can write a fairly decent article about all and everything; but no one can or will bear the strenuous work of following through a single solitary thought into the most tenuous logical ramifications. Instead, writing trivia is particularly appreciated today, and whoever writes a big book almost invites ridicule. In former days people read big books, and if they did read pamphlets or periodicals they did not quite like to admit it. Now everyone feels duty bound to read what is printed in a periodical or a pamphlet, but is ashamed to have read a big book through to the end, and he fears he may be considered weak in the head.
I therefore have decided to read only the writings of men who have been executed or have risked their lives in some way.
Kierkegaard’s insistence on reading “only the writings of men who have been executed or have risked their lives in some way” may seem a bit overboard or strict, but it conveys a standard that might be useful to the modern reader. As thousands of new blog posts, podcast episodes, pamphlets, and digital books are being published every single day, it behooves the avid learner to abide by a carefully chosen set of guidelines for content consumption. The inescapable fact that pervades the marketplace of ideas is that not everything is worth reading, watching, or listening to. One must consciously determine his or her aim and structure their study habits accordingly.
True For You, But Not For Me: Reflections on the Fallacy & Futility of Relativism
Tim Williamson on the Appeal of Relativism
Tim Williamson is a professor of Logic at Oxford University. In his discussion with Nigel Warburton, he talks about the nature of relativism, what makes it so appealing, the shortcomings and dangers of thinking this way, and why it’s a self-refuting and incoherent point of view.
Contrasting relativism with objectivism, the belief that some statements are either true or false independently of the preferences, tastes, and opinions of individuals or groups of individuals, Williamson briefly explains the position of the relativist:
The relativist is someone who doesn’t want to say “I’m right and you’re wrong;” who thinks that everyone has their own point of view; that their point of view is right from its point of view, but not from a different one; and that there’s no bottom line below that about who’s really right and who’s really wrong.
Williamson is careful to distinguish matters of taste from matters of truth. The deliciousness of chocolate ice cream, for instance, may be a matter of taste with no objective criteria for determining the rightness or wrongness of the matter. That is, chocolate ice cream might be delicious to one person while unpleasant to another. When we make claims about the world, however, as opposed to mere claims about our subjective experiences, our claims are capable of being true or false depending on their agreement with certain facts or conditions. If a revisionist historian denies that the Holocaust ever took place, for instance, they would be making a claim that is either true or false. The reality (or non-reality) of past events does not change with our desire for them to be true or untrue. Even though we may disagree about certain things, the difficulties we experience when attempting to settle debates are not evidence for the claim that there is no such thing as truth.
The mere fact that there’s no scientific test that something is the case doesn’t mean that there’s no truth of the matter.
There’s a difference between what’s true and what’s known. If something is unknown, it doesn’t follow that it’s untrue. For example, nobody knows whether there’s life on other planets, but that doesn’t mean that there’s no truth as to whether there’s life on other planets.
Anticipating the objection of the pragmatist who claims that truth, by definition, is that which provides utility, Williamson argues that pragmatism, while useful as a method for organizing our priorities, is inadequate as a theory of truth because it excludes meaningful propositions merely on account of their lack of usefulness.
Someone like William James seems to have thought the truth is bound up with what we can actually find to be useful, but that’s not a view which has stood the test of time very well. Either there was a mammoth standing on the spot where we now are a hundred thousand years ago or there wasn’t. Nobody’s ever going to find out whether there was or not… Whether there was a mammoth here doesn’t depend on whether it’s useful for us to think that there was. And partly what the pragmatist may be insisting on is that there’s no point in asserting that there was a mammoth here or in denying it if we don’t know which, but we don’t need to assert or deny it for it to make sense to say that it’s a meaningful question. It’s just a question that we’re not in a position to answer.
According to Williamson, the relativist is driven by the fear of being offensive or the concern about appearing close-minded more than they are by an accurate understanding of the relationship between ideas:
They’re afraid that if we have a non-relative view of truth where some things are just true and others are just false, then we’re going to be in the position of saying to people we disagree with “I’m right and you’re wrong” and that’s the kind of thing that we say when we’ve failed to persuade somebody and we’re just insisting on our point of view and, as it were, at that point it seems to come down to a question of power: that whichever one of us is stronger is going to be the one that prevails irrespective of the arguments. And so the relativists tend to think that by avoiding talk of absolute truth and absolute falsity, we can somehow escape from that position where we’re imposing our view on other people.
Many people use religious dogmatism as an example of how a belief in absolute truth can go wrong. While there certainly exist people who speak authoritatively about their beliefs in a close-minded, overconfident, and immodest way, Williamson argues that there is no necessary connection between being a dogmatist and affirming the existence of truth:
The connection that relativists think they see between absolute truth and dogmatism isn’t really there. You can believe that there is such a thing as absolute truth without thinking that you’re in a position to be certain about what it is.
Another common objection raised against the belief in objective truth is that such a view is responsible for the imposition of western values on non-western cultures. Williamson argues that our concern about the imposition of western values is itself an indicator that we aren’t neutral about the truth-value of different points of view. While it may be tragic or unfair for one group of people to impose their views on another group of people, that doesn’t mean that all views are equal.
The people on whom the West was imposing its values and culture were not themselves relativists. They were also people who had their own points of view which they didn’t simply regard as one point of view amongst many. They were just as committed to it as the West was committed to its point of view. It’s simply that they had less military power and so they lost at least in the short-term.
Williamson further makes his case against the incoherence of relativism by showing how the very ability to make distinctions, which is essential for any kind of thinking, relies on the presupposition that some ways of thinking are not equal to other ways of thinking:
The difficulty with relativism about truth is to formulate it as a coherent doctrine. A philosopher who’s often thought of as a relativist about truth, and who did say some things along those lines is Nietzsche, but even he clearly had his own beliefs about how things are and regarded people who disagreed with him as profoundly mistaken. What I’ve been arguing is that just by thinking things, you’re committed to a kind of asymmetry between those who think that way and those who think in some opposite way. It’s a point about what you’re logically committed to in thinking anything. If you have a point of view at all, you’re thinking things are some way and so that implies that there’s an asymmetry between people who think they’re that way and people who think they’re not that way.
People do have disagreements about logic. We can certainly consider the possibility that some of our logical reasoning is mistaken, but if you don’t have any form of reasoning, then nothing that you think has any consequences. So there’s no point in thinking anything.
The mere act of forming a specific thought commits us to the view that we are thinking about some things while simultaneously not thinking about other things. The moment I choose to think a thought, I am logically compelled to affirm that I am thinking about that particular thought and not something else. Suppose for instance that I am thinking about cats. It would be false, under such a circumstance, for someone to say I am not thinking about cats. Or suppose I am feeling sad over the loss of a loved one. It would be false for someone to say I am not feeling sad. If I am thinking about the amount of money in my bank account right now, it would be false for someone to assert that I am not actually thinking about the amount of money in my bank account right now. Williamson contends that this simple fact of logic is undeniable, but that we’re driven to deny it because we’re afraid it would make us arrogant to say that another person’s claims are false. The antidote to arrogance, authoritarianism, and dogmatism, however, is not the denial of truth, but rather the determination to remain humble in our claims about what we know and respectful in our efforts to communicate it.
Williamson concludes his discussion by showing how relativism ultimately contradicts itself:
Relativism, even if it’s not, at least in its own qualified form, a consistent position, is still something that many people in contemporary western society are attracted by. That may be confusion, but confusion itself can be very influential. For example, you see people making claims that all points of view are of equal value forgetting that that is itself a point of view and presumably they’re claiming that that’s of no more value than the point of view that some points of view are more valuable than others. I think it’s possible to undermine one’s own thinking and one’s own capacity for moral action through confusion.
The conclusion of Williamson’s argument here is difficult to deny. He challenges us to consider two propositions:
Proposition #1: Some ideas are better than others.
Proposition #2: All ideas are equal.
While proposition #2 is more politically correct, it’s actually self-defeating. If proposition #2 is true, then one is forced to also accept proposition #1. In other words, it would be inconsistent for someone to say “all ideas are equal” while condemning the intolerance of those who assert that “some ideas are better than others.” If all ideas are truly equal, then the idea that some ideas are better than others must be equally embraced. But since the idea that some ideas are better than others contradicts the idea that all ideas are equal, the idea that all ideas are equal is self-defeating. The only way to avoid this trap is to accept the fact that some ideas are truly better than others while rejecting the notion that one must be arrogant or close-minded to affirm this.
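The self-defeating structure of the second proposition can be compressed into a short propositional sketch. This is my own informal formalization, not notation from Williamson’s interview; the symbols $E$ and $B$ and the “outranks” relation $\succ$ are choices of mine for illustration:

```latex
% Let E abbreviate "all ideas are equal" and B abbreviate
% "some ideas are better than others", so that B is equivalent to ¬E.
\begin{align*}
&1.\ E && \text{(assume, for reductio)}\\
&2.\ E \rightarrow \neg(E \succ B) && \text{(if all ideas are equal, } E \text{ cannot outrank } B\text{)}\\
&3.\ \neg(E \succ B) && \text{(from 1 and 2)}\\
&4.\ B \equiv \neg E && \text{(by definition of the two propositions)}\\
&5.\ \therefore \neg E && \text{(} E \text{ grants its own negation equal standing, undermining itself)}
\end{align*}
```

The formal point is modest: the egalitarian claim strips itself of any authority to rule out its rival, which is precisely the asymmetry Williamson argues all thinking presupposes.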
For Williamson, the current popularity of relativism is a sad reality. “It’s disappointing when people are satisfied with a cheap slogan as a solution to really serious problems about how we live at peace with people with whom we profoundly disagree,” he says.
Perhaps philosophers like Williamson and shows like Philosophy Bites will eventually succeed at convincing people that it’s possible to be logical, rational, and objective in our pursuit of truth without being mean-spirited towards those who disagree with us or overconfident in our own estimation of what we know.
Henry Hazlitt’s Insights on How to See Through Economic Fallacies & Where to Draw the Line of Demarcation Between Good Economists and Bad Economists
Economics in One Lesson by Henry Hazlitt
On the plight and cause of economic fallacies:
Economics is haunted by more fallacies than any other study known to man. This is no accident. The inherent difficulties of the subject would be great enough in any case, but they are multiplied a thousandfold by a factor that is insignificant in, say, physics, mathematics, or medicine—the special pleading of selfish interests. While every group has certain economic interests identical with those of all groups, every group has also, as we shall see, interests antagonistic to those of all other groups. While certain public policies would in the long run benefit everybody, other policies would benefit one group only at the expense of all other groups. The group that would benefit by such policies, having such a direct interest in them, will argue for them plausibly and persistently. It will hire the best buyable minds to devote their whole time to presenting its case. And it will finally either convince the general public that its case is sound, or so befuddle it that clear thinking on the subject becomes next to impossible.
In addition to these endless pleadings of self-interest, there is a second main factor that spawns new economic fallacies every day. This is the persistent tendency of men to see only the immediate effects of a given policy, or its effects only on a special group, and to neglect to inquire what the long-run effects of that policy will be not only on that special group but on all groups. It is the fallacy of overlooking secondary consequences.
On the essential difference between a good economist and a bad economist:
In this lies almost the whole difference between good economics and bad. The bad economist sees only what immediately strikes the eye; the good economist also looks beyond. The bad economist sees only the direct consequences of a proposed course; the good economist looks also at the longer and indirect consequences. The bad economist sees only what the effect of a given policy has been or will be on one particular group; the good economist inquires also what the effect of the policy will be on all groups.
On the fundamental lesson upon which the whole of economics rests:
…the whole of economics can be reduced to a single lesson, and that lesson can be reduced to a single sentence. The art of economics consists in looking not merely at the immediate but at the longer effects of any act or policy; it consists in tracing the consequences of that policy not merely for one group but for all groups.
Nine-tenths of the economic fallacies that are working such dreadful harm in the world today are the result of ignoring this lesson. Those fallacies all stem from one of two central fallacies, or both: that of looking only at the immediate consequences of an act or proposal, and that of looking at the consequences only for a particular group to the neglect of other groups.
Arthur Zajonc on Knowledge as an Invitation to Inquiry
Love as Moral Knowing (Farnam Street Blog)
In this short article from the Farnam Street Blog, Shane Parrish shares insights from an interview with Arthur Zajonc, author of Meditation As Contemplative Inquiry: When Knowing Becomes Love. The passage that gave me pause was Zajonc’s concise but compelling insight into the nature of knowledge:
Knowledge is not something you can just move across the table, and the other person has it. It’s an invitation to exploration, to think, to ideate.
The Christian apologist, Gregory Koukl once described critical thinking as a process that “cannot be taught, but must instead be caught.” This echoes what Zajonc is saying here. In our efforts to learn and teach, we must strive to remember that the cultivation of knowledge is a dynamic process that unfolds over time, not a static one-time experience that can be achieved by having someone tell you what to think.