Emma Goldman, political cartoon, From the Yiddish Press c. 1901
Some new women’s history resources are available for you!
Women’s History Research in Archives at the University of Wisconsin has a new guide out that allows researchers to access a large number of digital primary sources. Topics include:
women in the armed forces
The bookmobile, 1931–1940. The 1931 Dodge was "manned" by two ladies at all times: one to drive and one to stand on the running board to keep it from tipping over. Proper attire included "a long-sleeved dress, a broad brimmed hat and gloves" to prevent tanning.
Recruitment poster for the Women's Army Corps (WAC) dated 1965, printed in green and black, and featuring an illustration of a woman in a WAC uniform. The Betty H. Carter Women Veterans Historical Project.
A friend of mine sent me this slideshow from a presentation she attended at SXSW called “Universities in the ‘Free’ Era.” Even without the accompanying lecture, the slideshow raises some interesting points about how we can change the structures of formal education to accommodate alternate learning styles and new technologies. I don’t necessarily agree with all the points raised here, but it’s worth checking out. For instance, I don’t advocate abolishing academic departments; I see merit in maintaining the modernist structure that a “History” or “Philosophy” department provides, allowing students to immerse themselves in the theories, historiography, and current debates of the field. However, we absolutely should not be limited by such demarcations. The brilliance of interdisciplinary fields is the ability to understand the intersectionality of theories and experiences across disciplines. You’re not going to be able to truly engage with the issues of the Study of Women and Gender unless you are exposed to, and actively engage with, issues in the fields of history, Afro-American, Latina/o, and Asian-American studies, economics, biology, law, etc. Most interdisciplinary departments understand this and are rooted in a belief in the importance of teaching this interconnectedness.
Unlike some 95% of my cohort, I did not grow up with computers. Even if they had been in wide use during my formative years in the ’80s/’90s, my family couldn’t afford one. The first time I ever saw the internet in action was during my senior year in high school, when a group of friends looked up Ralph Wiggum sound clips in the library – an event so unusual at the time it still clearly stands out in my memory. I didn’t own a computer, an iPod, or a mobile phone until 2006. I added the option to text to my phone plan last year. I am a product of research trips to the library, fliers, payphones, handwritten letters, and zines.
Computers were just not part of my reality. At all.
But it is no longer the mid-’90s. As a grad student living in an international metropolitan area in 2010, I have accepted my utter dependence upon computers to apply for jobs, register for classes, locate directions, and find out when the next train is coming. I begrudgingly created a Facebook account in college to accommodate my undergraduate and tech-savvy friends, and as of last week, as a class requirement, I have a Twitter account. I’m no longer intimidated by computers, having used them frequently in my work and research over the past five years, but I’m still ambivalent about the effect they have had (and continue to have) on society. I still have enough hold-outs in my life who refuse to own a mobile phone or car or email account to convince me we haven’t all sold our ability to be independent beings, along with our souls, to the computer gods. But they are becoming rarer and rarer, and I’ve noticed my reactions to these lovable Luddites shifting, from a righteous “Right on, sister” to “OK, I’m about to buy you a phone to make my life easier.”
My primary critique of this new social media and Web 2.0 lifestyle is that the ways in which people interacted with each other, and with society in general, in the past seem to be disappearing – and few people seem concerned by this. Now, I am enough of a past-obsessed dork to be pursuing a PhD in the subject, so I freely admit my bias, but seeing a group of teenagers congregating around a table with their eyes glued to their phones or laptops makes me incredulous. Aren’t they irritated that their friends are ignoring them? Are they talking to each other on those things?!? It reminds me of that scene in Clueless when Dee and Cher can’t wait the 30 seconds it takes them to see each other in the hall between classes, so they whip out their phones and continue to talk to each other even as they’re walking next to each other. Don’t people realize that that scene was a joke, a parody making fun of that kind of behavior, and not something to emulate?
It’s not just that this tech-cult[ure] seems to be breeding rudeness and self-centeredness; those qualities are hardly unique to the 21st century (or teenagers, or Americans for that matter). I think the issue is deeper than that. The thing that most concerns me is that we now have a generation of teenagers and young adults for whom instant gratification and instant knowledge are all they’ve ever known. There are people out there who haven’t experienced being driven by intense curiosity into action to research a topic themselves because they HAVE to know about it. They haven’t scoured the books and encyclopedias at the local library for information on silent films or Bessie Smith or how bicycles work because they heard/read/saw something about the topic three days ago and it’s been on their minds since then and they can’t stop thinking about it until they learn more. They haven’t tried to figure out the lyrics to “Smells Like Teen Spirit” by listening to and rewinding the tape over and over until they think, but aren’t quite sure, they have it (“ok, ok, mosquito I get, but what’s an albino?”). The process of researching a topic from numerous sources, at the library or wherever else, exposes us to multiple perspectives, providing us with the skills to synthesize the information and come to a conclusion ourselves, instead of immediately running to one website for an answer.
There seems to be this expectation now that everything should be available on demand, instantly. Why should we wait for anything? Well, because in life we wait, often for long periods of time, and often for unexpected or undesired results. And most of us are aware of this, because we have experienced the waiting: yes, it requires patience, but it also sparks ideas and desires in us that help us understand where our passions lie. During this waiting period we also tend to talk to others about the topic on our minds, again exposing ourselves to a greater variety of information and perspectives. And now there is a generation of people entering the workforce who have not experienced this type of delayed gratification, and who can’t imagine not having what information they want rightthissecond.
But as I’ve been poking around on and talking to my friends about Twitter, I’ve begun to step back from my instant reaction of “Great, another way we can maintain superficial relationships while avoiding actually talking to each other in person.” What I’ve begun to understand about Twitter, and digital media in general, is how it is being used not only for social interaction, but for intellectual engagement as well. I think it’s pretty exciting to see how digital technologies are queering the traditional structures of education. From following the musings of academics working on their research on Twitter, to watching U.S. Army propaganda cartoons from the 1940s on the National Archives’ YouTube channel, to browsing through digital documents about the Y.W.C.A., experiencing history is becoming a part of deconstructing traditional scholarship.
To be clear, this new form of history is still a class issue. Many, many people don’t have access to computers, or can’t take time off work to go to cultural or historic events, or don’t have museums (or free museums) in their areas. Lots of children are still learning about the past only via television and movies, while the educational experiences of others consist of being taught by the high school coach with no training or particular interest in history, who reads out of the teacher’s edition of an outdated textbook. Access to information is not equal, no matter how much open-source proponents and Web 2.0 advocates might claim it is. But perhaps instead of dismissing the technology altogether and focusing solely on external social inequalities, we should be utilizing these new sources to challenge not only the structures in which we teach and learn, but also which audiences we’re engaging.
I’m no longer scared of computers or technology. I’m certainly not going to stop writing letters or taking photos on film, but I also think the idea of having GPS on your phone is brilliant, and I love the ability to share the amazing thoughts of people I know with the rest of my social network. So going forward, I am making an effort to be a little less Kill Yr Television and a little more tolerant of our YouTube/Twitter/Facebook culture by actively becoming a part of it and finding ways to teach and learn through it.
update: 1.31.10 Hee, xkcd gets it, with dinosaur puns to boot.