Thursday, December 30, 2010
As I write my house is full of kids--three of my daughters (ages 12 to 18) and four of my nieces and nephews (ages 2 to 8). I am the only adult here. I've been watching the kids interact, and seeing the mutual pleasure they find in being together. Much of their play is about learning--how to play a particular video game, how to make lunch, how to keep track of each other, how to make sure that everyone is having a good time.
Their presence (along with a recent conversation in which my wife pointed out that we are potentially grandparent age, since two of our own children are now over 18, and that she would welcome some little kids around the place--yikes!) has me thinking about age and what it means for learning.
When my kids were smaller they went for a year to a non-graded charter elementary school called Sundance Mountain School. (It continues to this day, though under a new name, Soldier Hollow Charter School.) I was never sure how good the learning was in a formal sense, but in terms of practical experience, there was something wonderful about 5-year-olds and 12-year-olds learning science and math together.
In this now-famous talk, Sir Ken Robinson makes the point that among the industrial-era absurdities of schooling is that students are grouped according to their "date of manufacture" rather than some more educational commonality (or difference). It is among the practices that squash the creativity out of people.
It seems like colleges would be a place where we could learn about the role of age in learning. After all, most classrooms include students with different dates of manufacture, and particularly at schools with many non-traditional students, the age gaps can be quite significant. But there is no reform movement or pedagogical approach (that I know of) that attends to age, with the possible exception of freshman learning communities, which group students by age--sometimes to the frustration of faculty who think it makes the classroom "too much like high school."
Is it the case that once a person reaches, say, 18, age no longer matters, and therefore we make nothing of it in higher education? Are we losing potential learning by just assuming that the age system (which, for example, assumes you will be between 17 and 19 when you start college) makes sense? Is there some reason to accept things as they are?
Monday, December 27, 2010
Emergent education, or, Can friends start a college?
There is a long tradition in American civic life--one that I love. It is the tradition of small groups of people, friends often, co-religionists sometimes, banding together to respond to a social problem. Many things might come out of that response--laws, for example, or organizations, or movements, or communities. But at the core, these responses have always been organized around an ethos of friendship.
The intellectual history of the tradition runs from Tocqueville through Mary Parker Follett and Jane Addams to Jane Jacobs and Myles Horton and Ella Baker to Steven Johnson. The organizational history runs from frontier towns to community organizing and social settlements and folk schools to the civil rights movement and into the movements of today.
Today some of the most important thinking and organizing in this tradition is coming out of evangelical Christianity under the umbrella of "the emergent movement." Emerging Christian organizations have eschewed mega-churches and literalism and are focused on building Christian community out of questions and friendships. Doug Pagitt puts it this way in An Emergent Manifesto of Hope: "The emergent imagination is at its most basic level a call to friendship--friendship with God, with one another, and with the world."
When we talk about trends in education today, we tend to focus on structure and infrastructure: charter schools, standardized testing, technology, for-profit higher ed, assessment and accountability. There is some value in this. But this focus masks the cultural changes that are going on in schooling.
One major cultural tendency is towards standardization, efficiency, and systems. That tendency runs through all of the structural trends in education--systems of charter schools, national tests, system-wide adoption of technology, etc. It is largely about measuring outcomes to create a one-size-fits-most approach to education. It values the involvement of parents, students, and teachers, but largely as choosers: pick this school or that one; select this curriculum or that. Leadership is traditional--one person or a small group of experts in charge, elected or selected.
The other cultural tendency is towards emergence, relationships, and ecosystems as the basis of education. Where the systematizing trend treats choice as involvement, emergent education treats co-creation as involvement. It can be seen in charter schools--those started by collections of parents and educators who want better options for "their kids." It underlies the way that home-schooling is no longer a single parent teaching his or her own kids at home, but instead a network of parents sharing that role, meeting to swap curriculum, go on field trips, or expand educational offerings. It is hidden in some portions of the open learning movement and in some versions of technology-enabled education. It is behind collaborative creation of curriculum, and behind efforts to improve advising. It creates flat organizations with little formal structure. People lead where they can lead--they play the role they seek (and are best prepared) to play.
One wonders, though, if it has any chance in higher education. One would hope so, since higher ed is home to some of the worst results of big, efficient, standardized education. But I know of no recent instance in which a group of friends got together to talk education and ended up starting a college. This sort of thing happened a lot in the 19th century, when many small towns had their own locally grown colleges. Can it happen today? Can a college emerge?
Thursday, December 23, 2010
10 reasons why general education should come at the end, not the beginning, of college
Nearly every campus in the United States front-loads general (or liberal) education. At many schools, students take all of their GE courses in the first two years on campus. Even those schools whose GE programs include upper-division courses place most of the GE credits in the freshman and sophomore years.
Here are some key reasons why schools should consider reversing the GE/major sequence:
1. Students arrive with a misunderstanding of GE. Several years ago my colleagues at BYU and I polled freshmen on their views of GE. Most thought it was a continuation of high school--and, of course, many then treat the courses exactly that way.
2. More and more students bring AP credits with them to college. Those credits routinely count for GE courses, wreaking havoc on even the best-designed GE curricula (and if a school decides to accept AP credit only for placement and elective credit toward graduation, the policy breeds resentment).
3. Passion leads to engagement and retention. Most students come to college with a passion for something in the curriculum. Front-loading GE defers real engagement with a student's areas of passion, replacing it with courses the student may not connect with.
4. Faculty mentoring is essential for engagement and retention. And the more closely that mentoring is attached to a student's passion and major, the more durable and meaningful the relationship. Some students find their mentors in GE. Many more find them in their major.
5. Employment prospects depend on working in the field before graduation. More and more employers expect new hires to have meaningful work experience before they are hired. Placing the major at the end of the curriculum means many students never get that experience, because they are not prepared for it until they graduate. Completing the major by the end of the junior year gives students a year to begin working in the field (in paid or unpaid positions) before going on the market.
6. Employers want graduates with GE skills--communication, critical thinking, teamwork, etc.--and they are generally disappointed in what their new employees bring. There are two curricular reasons for this. First, most GE skills get practiced in the first two years of college but are only vaguely or implicitly reinforced in the major. Second, many of these skills are discipline-specific. Placing a substantial portion of GE after the major ensures that students can connect their major to the GE skills they practice at the end of their college experience.
7. GE is about making connections across the disciplines. When students don't know the disciplines, those connections are hard to make. Students know little about the disciplines in their first couple of years, and they certainly don't know enough to connect their area of passion--their major--to other disciplines until after substantial engagement with that major.
8. Students need an opportunity to sum up before going into the world. A key part of GE is reflecting on learning, summing up, and taking stock of how one fits into the world. With a GE-first model, that sort of purposeful, curriculum-based summing up is rare.
9. Colleges need a chance to make their case to students. Most colleges and universities believe that important things happen to students in GE. They become more mature, they join the human conversation, and they understand how life at a particular college helped shape them. These beliefs are by and large true. But if GE does this work in the first couple of years, by the end of college students may associate these outcomes not with the college but with the major. If colleges want to hang onto their alumni, GE at the end helps.
10. Students are ready to engage with the big questions at the point of graduation. Anyone who has taught a freshman seminar and a senior seminar on the same topic knows that the discussion and learning are richer at the senior than the freshman level. If GE is in part about these big issues--justice, community, truth, beauty--then the time to focus on them is when students are ready. In other words, at the end of their college experience.
Wednesday, December 22, 2010
Generosity, debt reduction, and civic life
Rates of personal debt and corporate debt are in decline. Rates of personal savings and corporate savings are up. Banks are "sitting on" (whatever that means--odd phrase) over one trillion dollars in excess reserves; corporations, over three trillion.
The decline in indebtedness is generally seen as a good thing--a reassertion of the old American value of thrift, a marker of the end of "consumer culture." I understand this. And I am pleased that banks are re-capitalizing. But I wonder what it means in the context of this fact: in this difficult economic time, rates of personal giving are down (see also here and here), even while need is up.
Of course the simplest explanation is that people's budgets are tighter, and so they can give less. Or its variant: with unemployment at 9.6%, there are simply fewer people who can share their wealth. I am willing to grant this argument, but only up to a point.
Because one thing underlies both debt and giving--the belief that making a promise to another party about our future behavior is a good thing. Taking out a loan is, at its most basic level, a wager that things will be better in the future. You make that wager by connecting with an entity--a bank perhaps, but just as often a family member or friend--that is willing to invest in you on the assumption that things will get better too. (After all, no one lends money assuming it will not be repaid.)
Generosity carries the same assumption--that in giving, both the recipient and the giver will be better off. The improvement occurs in three places: in the life of the recipient, in the life of the giver, and in the relationship between the two of them.
So when lending and giving are down, there are impacts beyond the economic ones. And key among those impacts is the effect on civic life. When there are fewer connections to other people, civic life becomes coarser, less intertwined, more selfish. We certainly see this in politics; if the decline in giving and lending is being occasioned by a decline in the willingness to wager with others on a better future, we will soon see it in our communities as well.
Labels: civic engagement, civility
Tuesday, December 21, 2010
Risk, risk management, and learning
Bryce Bunting, Derek Bitter, and I have been thinking together about the role of risk in learning. In the midst of that thinking, I came across Peter Bernstein's book Against the Gods: The Remarkable Story of Risk.
Bernstein argues that risk has a history, one that is tied up with the mathematics and concepts of probability. Before the notion of probability, the future was either radically certain (you did what had always been done; you went to heaven if you were good or hell if you were bad) or radically uncertain (one day, unexpectedly, you died). Probability allowed people from the Renaissance on to predict with some confidence the outcome of an action, and then to decide whether to pursue that action based on how much risk they were willing to take on.
When educators talk about the importance of risk in learning, they generally mean that by asking a student to do something with an unknown and potentially scary outcome, they get deeper learning. In a recent TED talk, for example, Diana Laufenberg describes how she asks students to research, plan, and carry out projects that respond to real-world problems. These are risky activities--hosting an election debate, for example. She makes a strong case that there is better learning in these activities than in rote learning, or in learning where there is a single right answer to a problem.
Well enough. But in thinking this way, educators take a pre-probability view of risk. Or in other words, educators focus on the role of uncertainty or indeterminacy in learning. Educators value uncertainty.
Students, on the other hand, live in a world of probabilities. They are risk managers, constantly adjusting their priorities, time, and relationships in order to get the best likely outcome. Hence the questions about what will be on the test, or whether there is extra credit; hence the requests for an extra point here and there. In doing these things, students are managing their risks, gathering information that will allow them to more accurately predict the results of their actions. Students value strategy.
What does the strategic orientation to risk among students mean to teachers? Do you derail the strategic orientation by doing away with grades? Should schools offer only one course at a time so that course gets the student's entire attention? Or are there ways to take advantage of student risk management to get good learning?
Sunday, December 19, 2010
Can experiential education teach wisdom?
Real-world experience teaches two things (at least)--how to do something and, over time, how to make sense of that thing in the world. The "how to make sense" part, in morally complex settings, becomes wisdom.
So, for example, when a first child is born, her parents learn how to parent--how to feed and clothe and comfort and educate their daughter. But they also learn harder things--how to discipline, how to choose between competing needs, how to suffer because of and with the child, how to find joy.
Or, as Confucius purportedly put it: "By three methods we may learn wisdom: first, by reflection, which is noblest; second, by imitation, which is easiest; and third, by experience, which is the bitterest."
Experiential education (as distinct from experience) focuses overwhelmingly on how to do things. So, for example, if you want students to learn how to run a genetics experiment, have them run an experiment. They will make mistakes (because doing something is riskier than learning about something), and those mistakes, together with a smidgen of success and guidance from the teacher, will become an understanding of how to do something.
But does learning in this way also make students wise? I have been around variations of experiential education for most of my career, but I cannot think of a time when I, or anyone else, focused explicitly on wisdom as an outcome of our service-learning, or undergraduate research, or simulation, or group project (or whatever the experiential education happened to be).
Occasionally wisdom shows through in student reflections, but it almost always has to sneak through whatever the assigned reflection is. And increasingly, it seems, reflection focuses more on content acquisition than on the student becoming better acquainted with how she wants to be in the world.
Consider the typical reflection prompt: "What did doing X teach you about [the topic of the class]?" Or even the widely used ABC model of reflection: "How did doing X affect you? How did it influence your behavior? How did it change your cognition?"
I don't have good ideas about learning wisdom through education, let alone experiential education. Outside of schools, wisdom comes from religious practice, or from failure, or from age. Does it come from anywhere inside of schools?
Friday, December 10, 2010
Seeking the Vice Presidency of the United States in 2012
With Sarah Palin's new reality show and the posturing over tax cuts, don't ask/don't tell, and the START treaty, the 2012 election season has begun. And so I believe it is time to announce my candidacy for Vice President of the United States.
Why seek an office that has been unfavorably compared with a "bucket of warm piss"? Why not seek the "leader[ship] of the free world"? For one, seeking the presidency is an act of tremendous self-regard. For another, once a person declares for the presidency, the focus shifts to that person's prospects and personality, not to ways of working or orientation to the world. And more significantly, by starting with the Vice Presidency and building a group--prospective Secretaries of State and Treasury and Defense--voters will have the chance to consider the entire team, not just its most prominent member. So I'm recruiting for the "minor" positions. We will get to the presidency when we have some time.
What will we do? We will not speak of "the American [anything]." No mention of the American people--there is no such thing, just shifting coalitions of people living in the United States. No "American economy." The economy is a complicated system stretching around the globe and coming to ground in towns and neighborhoods and homes. There is no line where the American economy ends and others begin.
We will not suggest that the choices are "either/or." Everything is "neither/and." For example, the current debate is not about a tax cut for the wealthy or about creating jobs. It is about both. And about neither.
We will not take responsibility for anything we are not responsible for. Nor will we blame any person or party for things they are not responsible for. The President does not fix or ruin the economy; no one has that much influence. We live in an interconnected world--at best we can shake one part of the web. So we ought not to be too proud of our ability to actually do things, or too quick to claim that our opponents have done something.
We will not solve problems. Problems, at least serious ones, don't get solved; they get worked on, and the solution leaves other issues still to work on. Governments don't solve problems; they pick their favorite version of a problem and struggle against it.
We will raise taxes and cut programs because our government is both too poor and too big. We will become increasingly unpopular and be happy with that.
We will point towards a future where groups of people can work on the problems they care about at the level they can work on them. We will be libertarian in politics and communitarian in organization. We will expect the government of the United States to remain a defender of the liberties of the people who reside within its borders and a setter of aspirations. But it will not run the programs or make the decisions about how to reach those aspirations. So we may want all 18-year-olds to graduate from high school. Excellent aspiration. Let communities and schools and parents start working. In other words, more judiciary, more rule of law; fewer laws, a smaller executive branch. Government as accrediting body.
We will hope for a future where the United States is a big Switzerland--prosperous, free, democratic, neutral, and less concerned about its place on the world stage and the use of power than about helping people and nations work out their difficulties even if it makes us seem weak.
I am of course jesting--a person like me has no chance of becoming Vice President. A platform like this, one that focuses on rhetoric and process, means close to nothing in our system. And a plan like this, to circumvent the electoral circus and the hubris of the Presidency, stands no chance. But I am serious about the future I would like to see and the pathway to it. Anyone interested?
Labels: choice, civic engagement, civility
What can a screaming doll teach a 15-year-old?
For the past week my 9th-grade daughter has carried, first, a sack of flour dressed like a baby, and then a computerized baby doll, with her 24 hours a day. The experience is part of Teen Living, a course required by the Utah state curriculum.
The express purpose of this assignment is to make caring for the baby doll so onerous that teens do not get pregnant. To this end, caring for the sack of flour means the student has to wake up at 2 AM each night and carry it around the house for 15 minutes. The computerized baby comes with a key that gets taped to the student's wrist. At random times throughout the day and night the doll starts screaming. When it does, the student puts the key in a slot in the doll's back. The doll immediately stops crying, but the student must hold the key there until it cries again (usually between 5 and 15 minutes), at which point she pulls the key out and the doll quiets for another couple of hours.
The screaming doll anti-pregnancy project is remarkable for two reasons. First, it requires an enormous accommodation on the part of the school, where all day long for months on end kids carry sacks of flour or screaming baby dolls to class. In a school culture where a word out of turn or a t-shirt with an offensive slogan can result in suspension, the school's willingness to allow the dolls is incredible. (The patience of parents and siblings is equally noteworthy. We made our daughter sleep in the basement, since the noise of the doll woke the entire family each time it cried...)
But the project is also fascinating for the amount of trust it puts in experiential education to change teen behavior. I have no idea whether there is any data on projects like these, but school superintendents and legislators must be confident enough in the doll's power to pay for the things.
One wonders why this is the case. After all, aside from this and driver's ed, there is no other place where the school mandates that students learn by doing. So is there something about danger (of pregnancy, or dying in an auto accident) that makes the schools trust experiential education? Or is there something about the power of embarrassment, either by a screaming doll or a poorly driven car, that educators think will help students learn?
In a comment on a previous post, Derek Bitter wondered what risk in the classroom might look like. Here is a real example--the risk of having attention drawn to you, used to discourage risky behavior. But it raises a further question--can risk teach what you want it to? In my daughter's experience the answer is a qualified no. When we talked about what she learned, she mentioned patience and a bit about how tiring it can be to care for a baby. Did she think it would keep kids from getting pregnant? Not really, she said.
Carrying the doll went pretty well until one night at her band concert, when a fellow band member stole the doll. He kept it from her, stuck it in his pants, and humiliated her. When I got to the show she was in tears. Risk doesn't always teach the lesson we think it will.
Saturday, December 4, 2010
Imagine there's no...leader
What if there were no Department Chairs? Or Deans? Or, for that matter, Presidents, Provosts, or any of the cabinet-level VPs that are part of today's higher education leadership? Is it possible for a college or university to succeed without titular leaders?
In asking, I am not complaining about any of the leaders at Westminster. Our campus is fortunate to have a leadership slate that is both hard-working and unusually committed to the institution. But it is the case that at one time or another most faculty and staff have wondered about the usefulness of the leadership corps, both here and on every campus. And many have speculated that it is leaders, not faculty and staff, who stand in the way of real innovation and real quality in higher education. So it is worth asking what conditions would be necessary for campuses to imagine there's no leadership.
The business school at Westminster is called the Bill and Vieve Gore School of Business. It is named after Bill and Vieve Gore, alumni of the college and later the founders of W. L. Gore and Associates. Gore is renowned both for its products (Gore-Tex being the most famous) and for its organization and culture, both of which are designed for innovation. One feature of the organization is a lack of hierarchy almost unheard-of in corporate America. There are a few "leaders," but most associates play significant leadership roles in proposing, designing, and building products; there is very little formal hierarchy. (For a great book on organizations like Gore and their strengths, take a look at my friend Jeff Nielsen's The Myth of Leadership: Creating Leaderless Organizations or my colleague Melissa Koerner's blog, "High Performance Organizations.")
When recruiting new MBA students we often run them through a case on Gore, both as an introduction to our pedagogy and to draw a connection between Gore and Associates and the Gore School of Business. A couple of evenings ago I led the case discussion at the recruiting event. Doing so evoked some of my own interests in self-organization and the ways that social movements can emerge without formal leaders. (Take a look at Steven Johnson's Emergence or Jane Jacobs's The Death and Life of Great American Cities, or, if you can find them, my obscure book chapters on educational change, "Neighborhoods and Networks" and "Making Moral Systems of Education.")
That same day, the management faculty and I began to talk about how to select a new chair of the department--the existing chair is taking another assignment at the college. Doing so is no easy thing, since all of the faculty are busy, some have other administrative loads, and few are interested in the combination of tasks that a department chair carries. It is the overlap of these things--talking about Gore and looking for a department chair--that raised the question of whether higher ed institutions could flourish without leaders.
So how do you come at the question of whether colleges and universities need leaders? Two ways come to mind--first by asking what those leaders do that would not happen in their absence, and second by wondering whether colleges and universities have the sorts of cultures that support leaderless-ness.
So what do higher ed leaders do? They may be involved in providing vision or setting strategic direction, though in higher ed those things are usually set through a long collaborative process in which "leaders" often take a behind-the-scenes role. More frequently they resolve disputes, make decisions based on policy, build friends for the institution, serve as a sounding board for faculty and staff, watch over budgets, attend to academic and co-curricular programs, and, most importantly, share information that advances the institution. Without a long exposition on each of these roles, I will simply point out that nearly all of these tasks could be done without formal leadership, if the organization and systems and culture were right.
But are they? Almost certainly not. And here is one of the real ironies of higher education. The cultures of colleges and universities are laissez-faire, especially on the academic side of the house. Faculty have wide latitude to teach, grade, select courses, and make plans. (In fact a big part of leadership in HE is to find ways to merge the individual interests of faculty into a more-or-less coherent education for students.)
We often think of this arrangement as an example of leaderless-ness, or at least flat organizations. But it is exactly this culture that makes "academic leaders" necessary, for the laissez-faire culture of higher ed means that little of the work of academic leaders gets done unless someone is assigned to do it. Information doesn't get shared, connections do not get made, programs get overlooked, etc. So the culture of higher education tends to be flat, individualistic, and disconnected; the culture of HE leadership exists to overcome that disconnection.
Now it is worth asking whether the current model of academic leadership enables disconnection or responds to it. Likely both. But given the external demands on leaders--from donors, parents, students, accreditors, and others--it is unlikely that changes in campus culture will come from them. So if faculty and staff are interested in reducing the leadership layer, perhaps the first step is for them to find more ways to work together, taking on more of the tasks of leaders so that the need for leaders falls away.
Labels: innovation, leadership, questioning assumptions, system reform