"For [the theoreticians of photography] undertook nothing less than to legitimize the photographer before the very tribunal he was in the process of overturning." -Benjamin, Little History of Photography I want to explicate some of the issues I raised in the last post, address some of the comments, walk back my position on at least one point (yes you are all right the word "bad" was not a fair characterization), and dig in on a few others.To keep these posts stylistically similar let me again start with two observations. 1. One of the essays I most enjoy teaching in my media studies classes is Benjamin’s The Work of Art in the Age of Mechanical Reproduction. When teaching this essay I often begin the class by saying Benjamin understood why Ebert was wrong. That is Ebert, rather famously claimed that while video games might demonstrate a high level of craft, they will never rise to the level of art. Of course what Benjamin argued in The Work of Art, at the time in relation to photography, was that the question should not be "Is Photography Art?" but rather the more important question: "What does having photography do to our concept of art?" (By extension the question of video games should be what does having video games do to our concept of art.) This is similar to how I think about the concept of digital humanities. I think we should not be asking, can the humanities be digital, or how does the digital allow or not allow us to do humanities, but rather, what does having the digital do to our idea of the humanities (and by extension what it means to be human). Anything short of this strikes me as less than interesting, but more importantly a missed opportunity. 2. Okay, I can tell I am really going to get in trouble for this one but . . . The following is not originally my observation, I wish I could take credit for it as I generally agree and think it is really astute, but it’s not mine. 
(I will let the original source remain anonymous, as it was an off-the-record conversation, but if said person wants to claim it, I will note credit here.) Generally speaking (painting with really broad but accurate brush strokes here), Digital Historians and Digital Literary Scholars have had significantly different approaches to incorporating "the digital" into their respective scholarship. Digital Historians have leveraged the digital to expand and engage a wider public in the work of history. As examples of this, think of Omeka, or of leveraging social media to engage in crowdsourced projects. That is, Digital Historians have often begun by asking "how does the digital allow us to reach a larger/public audience?" Now this could be because many of the folks working in Digital History come from a public history background . . . But in the case of literary studies, the "digital" projects have not, as much, changed the scope of the audience. So if you look at digital literary projects, they often look remarkably similar to projects in the pre-digital era, just ones which have been put on steroids and run through a computational process. Seems to me that the Digital Historian model is a better one.

Okay, so on to the post . . . I can't help but notice that most of the talk, or at least critique, in the comments centers around the last paragraph, largely ignoring the analysis which led me to that paragraph. (To be fair, I sort of invite this, saving my central and controversial claims for that section, but still . . .) That is, the early part of the post has as its supposition that "Universities are still valuing the wrong stuff," and by Universities I am mostly talking about humanities scholars, but that's only because the context was the MLA.
When I look at what type of digital scholarship in the humanities is being recognized and valued by the institutions within which we operate, it seems that that scholarship is mostly conservative, doing little to question, upset, or threaten the dominant paradigms, and that what I see as truly important work has yet to receive recognition. The fact that someone like Brian can be without a job and largely a "real nobody" while he is such a significant "virtual somebody" is just one example of this.

In his comment on the original post, Tim Lepczyk suggests that a large part of the problem here is in defining what I, or anyone, mean by the digital humanities, or humanities 2.0. I think this is spot on, and this is probably one of the most slippery parts of my argument, one I haven't entirely worked out. As he points out, there has been a certain amount of baggage from prior text analysis that is ported over in the upgrade to digital humanities. I definitely see humanities scholars as collaborating with computer scholars, IT folks, and people from a range of places within and outside the academy. (Indeed one of my favorite presentations at the MLA addressed one particularly thorny aspect of this issue, @nowviskie's take on intellectual property and labor in the age of collaboration.) But I think if what the digital does is just take the old disciplines and make them digital, leaving disciplinarity and the silo structure of the University intact, it will have failed. I want to see the digital transform not just the content or practice of the disciplines, but the very idea of disciplinarity.

But it is not entirely true, as Brian Breman argues, that I am advocating a "this changes everything" approach to the digital humanities. In fact my major fear, the thing that keeps me up at night, is the idea that "this changes nothing." Indeed that was the impetus for the original post: despite the digital, nothing changes.
It seems to me that the digital affords us (both as academics and as wider members of society) the chance to do something really different, to re-organize many of the founding assumptions we have about how to organize knowledge, how to organize people, and even the nature of what it means to be human. But I see us not necessarily taking advantage of this opportunity. In fact I see this as a fading opportunity: as our culture makes the "change over" from one intellectual substructure (dead tree) to another (digital network), it seems that we are porting over a host of prejudices about knowledge production and dissemination that are worth rethinking. (As just one example of this, think about intellectual property and knowledge ownership.) So, I would love it if "this changes everything," but unfortunately I think (as my original post claimed) that this has changed little, especially within the walls of academia. This is not to suggest that there are not some significant revolutions/projects taking place both within and outside of academia, but a lot of what is being done/counting as digital scholarship does little to question the founding principles of academic knowledge production, especially within the field of "literary studies" (principles which we can at this moment, perhaps, but for a very short time, re-negotiate). At its most radical, I'll raise the question this way: the rate at which some of this digital scholarship has been so smoothly/effortlessly incorporated into the walls of academia should perhaps give us pause to question whether it actually signals any change at all.

Again, to paint broad brushstrokes, but ones which I think are relatively accurate, scholarship tends to fall into two categories: 1.
That which does little to call into question the walls of the ivory tower, or, what is worse, strengthens those walls: a digital humanism which would build an ivory tower of bricks and mortar and supercomputers, crunching large amounts of textual data to produce more and more textual analysis that seems ever more removed from the public which the academy says it serves, re-inscribing and re-enforcing a very conservative form of humanities scholarship. 2. A digital humanism which takes down those walls and claims a new space for scholarship and public intellectualism. Now, while these two positions are not as mutually exclusive as I am painting them here, I am more than willing to sacrifice the first for the sake of the latter.

In the longest comment on the last post, @mkirschenbaum suggests that when we think about the internet we need to think not about the Derrida of The Post Card or Of Grammatology, but rather the Derrida of Given Time. This is perhaps the most succinct phrasing I have heard of the problem. We spend too much time thinking about the structure of the link or the data, and not enough time thinking about the social relations and ethical questions opened up by this space. And in this regard I agree in part with @sramsay's comment that "new tools can facilitate a new type of public intellectualism." The printing press was not just a faster version of the scriptorium; it was the "gadgets of the early modern period and the networks of communication in which they flourished" that changed the intellectual and wider cultural landscape. The printing press was not a mere tool by any means. But it is precisely at the level beyond the printing press as gadget that I want to look, and where I think we need to focus our efforts. On one level the printing press was just a gadget, and the real, important change came at the level of the social negotiation about how that gadget would be deployed. Authorship, intellectual property, authority, piracy, etc.
were all social/legal/cultural negotiations that occurred, and were not decided, at the level of the gadget, even if the gadget did speed up the rate of connectivity. If academic scholarship, just to take one example, asks "what can I author now on the web?" without first calling into question the notion of "authorship" and recognizing the degree to which it might be heterogeneous to the way knowledge can be organized on the web, we will have missed a golden opportunity.

I think I should have been clearer, or not so glib, in my paraphrasing of the question from my panel. To say that it was a "bad" question was wrong. What I should have said was that answering the question straight up is not the most productive way to look at the problem. Instead we should answer the question backwards: what if we thought about the "digital" not merely as an adjective (a gadget to be applied to the humanities) but as something much more? What does having the digital do to our conception of the humanities? That, it seems to me, is where we should place our focus.

And so this is where I am really going to dig in. @tanyaclement, correctly so, calls my analysis out, saying that like the MLA I am perhaps focusing too much on social media: "Clearly, there has been a lot of focus on 'Digital Humanities' this year because of the rise of twitter and, as such, DH has now been associated with social media almost exclusively. This is unfortunate." Where I am going to disagree with this is at the level of "unfortunate." I think this is a fortunate thing (if only it were the case). The more the digital humanities associates itself with social media, the better off it will be. Not because social media is the only way to do digital scholarship, but because I think social media is the only way to do scholarship, period. Yes, it is true that there are hosts of scholars having scholarly discussions who are not on Twitter, but you know what, they had better be, or they risk being made irrelevant.
No, this doesn't mean that every scholar has to have a Twitter account (though it probably wouldn't hurt), but it does mean that every scholar had better be having their discussions in public, on the web, in these digital spaces, for all to participate in. I realize that this stance displays a certain amount of irreverence to the very people on whose shoulders I stand in order to make this argument, but at the same time it displays a hyper-fidelity to their work, thinking about how it can be carried into this new digital substructure and used to shape this (perhaps) new way of organizing knowledge.

Yesterday this argument took a different sort of turn when Ian Bogost published The Turtlenecked Hairshirt: Fetid and Fragrant Futures for the Humanities. In part Bogost was weighing in on the question of the Digital Humanities and its arrival or non-arrival, but he was actually, it seems to me, making a much broader critique. Regardless, as he observes in the comments on the post, much of the discussion centers around a conflict between digital humanities and new media. Along these lines, Matt asked if this is not just a debate over semantics and, perhaps less generously, a territorial pissing match: throwing around the term "digital humanities" as an empty signifier, a backlash against the digital humanities. Let me be clear: I have no desire to engage in an academic territorialization argument. Honestly, I couldn't care less; having left an English department I am quite happy not to have to engage in those discussions. My position was a much larger one, addressing the question of whether or not the "digital humanities" has arrived, and, in a connected manner, what this means for the future of the humanities. It appeared to me that much of the discussion at the MLA was about the arrival of the "digital humanities" and, in a related theme, the extent to which this can serve as a "cure" (as Ian puts it) for what ails the humanities.
So let me put it a different way: maybe the digital humanities has arrived, maybe it is becoming central and important to the way that humanities scholars do their work, but the digital humanities that has arrived (the slow work that @tanyaclement mentions) is the kind of arrival that changes nothing, a non-event. The only type of digital humanities that is allowed to arrive is the type that leaves the work of humanities scholars unchanged. Seriously, don't tell me your project on using computers to "tag up Milton" is the bold new cutting-edge future of the humanities; or, if it is the future of the humanities, it is a future in which the humanities becomes increasingly irrelevant and faculty continue to complain at boorish parties about how society marginalizes them, all the while reveling in said marginalization, wearing it as a badge of honor which purportedly proves their superiority on all matters cultural. As Ian observes, "It's not 'the digital' that marks the future of the humanities, it's what things digital point to: a great outdoors. A real world. A world of humans, things, and ideas." That is what I was after in my original post: the idea that the digital I am hoping for, the digital that will challenge and change scholarship, hasn't arrived yet. For all the self-congratulation about the rise of the digital, little if anything has changed. Humanists are still largely irrelevant in the broader cultural discussions, and it seems to me they purposely choose to remain so. (Actually, I am not certain of the degree to which this is really about "literary" humanists, as it seems this issue plays out differently in history. But that might just be the perspective of an outsider.)
And this is the brilliance of Brian's paper (content notwithstanding): he made his material more relevant than all the other papers that weren't published; he engaged the outside (even if it was a paper that was a lot of inside baseball on the workings of the academy) because he opened his analysis and thinking to a wider audience (and, as @amandafrench and @bitchphd remark, did it with a real-time spin that enhanced it at the level of both content and delivery). Again, the real influence should be measured by how many people who didn't attend the MLA read his paper. Or maybe the real influence of his paper should be measured by how many non-academics read it. Scholars need to be online or be irrelevant, because our future depends upon it, but more importantly because the future of how knowledge production and dissemination takes place in the broader culture will be determined by it.
David Parry . Blog . Jun 09, 2016 06:32am
The following is a guest post from UT-Dallas graduate student Barbara Vance (@brvance). This past semester Barbara taught an atypical rhetoric and composition course. Barbara teaches Rhetoric 1302, the standard introductory college writing course. She was given a course with a group of students who, she was told, were struggling with writing and needed "more structure." In response Barbara did the smart thing and actually gave the students more freedom and control over their education. I'll quickly summarize, and then get out of the way and let Barbara tell the story. Essentially, Barbara turned the class into a documentary production class where the students spent the semester producing a film, working collaboratively on one project. Where is the writing, you ask? Well, read on; Barbara had them write about their experiences the whole time, giving them a reason and context to write. The results are pretty amazing. The post is a bit on the long side, but worth the read, as Barbara covers not only the "what" but the "why." Also check out the two embedded videos: the one below is the video from the students, and at the end is an interview with Barbara. This is a bold, risky approach, especially given Barbara's status as a graduate student rather than tenured faculty, but I think if college rhetoric, and indeed college education, is to remain relevant over the coming years, this is the type of experimentation and adaptation that will be necessary.

The Internet has fundamentally changed not only the means through which we communicate, but also how we communicate and how we think. It has, in turn, altered what others expect from our writing, what employers look for in applicants, and how we conceive of work that used to be private. One need only look at the blog explosion to see how the ability to disseminate our thoughts cheaply and quickly, and to develop a dialogue with others, empowered thousands to believe their voice was/is worth sharing.
Teachers cannot ignore this communication shift. A Kindle is more than a paperless book: it changes how we read, how we define reading, and how we perceive intellectual ownership. As society continues down a path toward ever-increasing mobile communication, our conceptions of how we persuade will also change. I think few Rhetoric instructors would argue with the idea that students should be able not only to consume information, something they've been doing their entire lives, but also to produce it. But as it stands now, most rhetoric courses focus strictly on writing, and they limit assignments to the classroom environment - practices that devalue other rhetorical mediums, and the purpose of rhetoric itself.

It is with this spirit in mind that I designed my special topics Fall 2009 freshman rhetoric course at the University of Texas at Dallas. I wanted to transform the traditional rhetoric class, with its standard textbook, into a more relevant, new-media-oriented course that focused not only on writing and speaking, but also looked at rhetoric in film, photography, and music. To that end, I designed the course to include a live WordPress blog on which students could speak to each other and to anyone else in the world who cared to listen. A website containing copies of their larger papers accompanied the blog. This made the assignments more communal in nature and reinforced that writing is meant to be shared. In a more traditional classroom environment, students write only for the teacher, an approach that makes assignments seem less relevant to the students and devalues the very idea of rhetoric. Requiring students to blog, contact people outside their classroom, and post writing on the Internet teaches them to engage with the community, gives their writing more significance, and supports rhetoric - a term that, by definition, implies community.
While this public exposure of their work can be intimidating for some students, it forces them to take more accountability for their words while teaching them the power of communication. If they embrace it, students can develop the sense of freedom and power that resides in someone who feels comfortable with both the tools of communication and the arenas that currently dominate the conversation. Right now, a majority of those conversations are increasingly happening online. Students must know how to navigate these waters. It is a direction more and more university rhetoric departments are moving toward, including Ohio State University, which has some excellent examples of class blogs.

A strictly digital approach is not for everyone. I will always prefer a paper book, believe memorizing grammar rules is essential, and don't think everyone needs a blog. Nonetheless, these are issues students should be aware of. Creating work in a vacuum delegitimizes it. When the goal of your course is to teach students to persuade, and you don't include what is now the most influential tool for disseminating an argument, you are crippling your students. Writing and reading online is different from performing those same tasks on paper. We communicate differently on the Internet, and as more and more people read from their phones and portable e-readers, our understanding of communication will change further still. As technology shifts, so do our means of persuasion; if students do not explore this, they will find their skills quickly out of date. Rhetoric is more than just learning a standard structure for an argument. Students should be asking themselves: "How does what we write and what we think change when we know that in ten minutes we can create a blog and broadcast to the world? How does this change how we see and portray ourselves?" These are the deeper rhetorical questions students need to grapple with.
It is this focus that will make them stronger readers, writers, and citizens.

The second media-based aspect of the course was centering the writing assignments around a film that the students would produce. My goal was that this would provide continuity between assignments while reinforcing one of the fundamental ideas underlying the class: rhetoric is found in a variety of media, not just writing. Many rhetoric programs devote time to "visual rhetoric," but it is often cursory at best and culminates in a short essay examining a film or piece of art. While I do not object to this method, I was always bothered that writing was still given precedence over the image. We tell students that pictures are a viable means of persuasion, and then we ask them to write about it. This hardly reinforces the message. So I thought: "Why not have the students work with the mediums they study, including film?" I "hired" each student for a position in the "company" based on their skills and interests, with the idea that this would not only hold their interest but also be quite germane to their course of study. Everyone had to apply for their job, writing a cover letter and resume and having a personal interview with me. Students were never entirely on their own, as the positions were part of larger groups: pre-production, post-production, marketing, and web design. Throughout the semester we discussed the various rhetorical aspects that comprise a film - including text, images, music, and sound effects - focusing on how and why creators made the decisions they did. Always, the emphasis was on these crafts as rhetorical devices. The end result was a website and corresponding film, created by the students and comprising their work throughout the semester. Overall, I have found it a fun, effective approach. An added benefit of the film was that it captured the students' interest, as did broadcasting their work on their website, www.rvuentertainment.com.
They became so invested in the film that the writing pertaining to it took on new meaning. The first essay required them to identify an issue in their local community and write about it. From these, the students voted on which would be made into a film. The second major writing assignment was a visual essay in which the students each described how they would make the film, supporting their paper with images they found online or took themselves. In addition to these, smaller assignments were given to each student based on their role in the company, including reports, marketing letters, short essays on artists who inspired them, and storyboards. All students were also required to blog weekly.

The students really took to the project and, aside from the procrastination that is a given for many college freshmen, they handled it well. Weekly student-run meetings in class kept everyone on the same page and let me know where things stood. There were also individual meetings in which I worked one-on-one or in small groups to help them with their respective roles. I admit, I had my doubts. Coming from a traditional writing background, and considering the department's goals, I felt the focus of the class should remain on writing aptitude, and the one constant question rolling around my head all semester was: "Are you doing the students an injustice? Are you taking time away from writing skills to focus on film, sound, and these 'alternate' methods of persuasion?" I think my fears were reasonable, but ultimately the class worked out well. Because so many rhetorical devices remain constant across mediums, teaching students how pacing works in screen cuts or music only reinforces how it could be employed in their writing. Overall, I think the class was a success. It taught the students to work with a variety of mediums and to always consider their work as something to share. It is this final point that the entire course hinged on: community.
The blog, the group film - everything the students did - was about engaging the world, establishing a presence, and utilizing the tools that the rest of the world is operating with, rather than limiting themselves to traditional print-based technology. Here is an interview about the project with Barbara.
David Parry . Blog . Jun 09, 2016 06:30am
Last week I sent Michelle Nickerson, a colleague of mine here at UT-Dallas, a link to Dan Brown's "Open Letter to Educators." Michelle, like me, is concerned about the future of the University, and as someone whose opinion I respect, I wanted to see her response. After watching it we swapped emails back and forth about Dan's video; at one point Michelle asked if I was going to write about it for this blog, to which I responded, "how about you write about it and I'll post it." So, the following is Michelle's thoughts on Dan Brown's piece. I don't entirely agree, but this is a good jumping-off point. Let the conversation begin.

University administrators and faculty should pay attention to the message of Dan Brown's "Open Letter to Educators." Students need to ask themselves, as Brown does: "What does it mean to receive an education?" Brown's most important observation is how the university, as an institution, is failing to change in ways that make it relevant to what he describes as "a very real revolution." He notes that technologies popular in higher education today - like email, on-line databases, and Blackboard - represent minor adjustments that fall woefully behind the curve of the real sea changes threatening to undo "the University" as an institution of learning. Brown, moreover, correctly identifies how shifting class relations challenge the current structures of higher education. I agree that the internet has, in many ways, proven itself a democratizing force in our society and many others. Brown's limited insight, however - contained as it is in his box of "information" - prevents him from seeing numerous other layers to this problem. I will talk about one. The university, as a concept, could very well disappear just as Brown predicts . . . for many Americans, but not for all.
As institutions of higher learning seek ways to economize by eliminating and devaluing the spaces of learning that have been so central to "the University," they are coming to resemble exactly what Dan Brown sees in them - exchange sites of information, marketplaces easily replaced by much cheaper flows of information accessed on the internet. As they pack more students into lecture halls and fill the rosters of on-line classrooms, universities save billions of dollars in the short run, but diminish the value of their degrees. Classrooms and other spaces in the university lose their meaning in this race to the bottom. The competition for more bodies per professor, however, does not threaten the university as a concept. This is where Dan Brown's class analysis could use some help. The "State University" - specifically, the notion of affordable education - is eroding. Financial and intellectual elites (rich people and academic types) tend to be suspicious of each other, but one thing they seem to agree on is what the space of the University represents, and they will not stop paying for it . . . they will continue to pay hundreds of thousands of dollars to send their children to Ivy League universities and small private liberal arts colleges. Princes and sheiks in foreign countries will continue packing their children off to the United States for higher education. These spaces, since they come at a very high price, are rarefied worlds that diverge ever more from that of state universities. Administrators of these universities know that parents aren't paying to send their children to these expensive schools for "information." They are sending their children to become the producers, manipulators, and interpreters of information.
When university classrooms, libraries, courtyards, and student commons are designed and utilized to their greatest effectiveness, they become spaces where students learn not for the sake of absorption (passively), but for the sake of generating new knowledge, developing new conceptual models, discovering new worlds of meaning not introduced by their professors. The professor to student ratio is critical in this respect, because the professor-as-critic-and-listener is just as important, if not more important, than the professor as instructor. I therefore recommend that viewers heed Dan Brown’s "Open Letter to Educators," but think more carefully about what is disappearing with the university. And for what it’s worth here is the video that sparked this conversation.
David Parry . Blog . Jun 09, 2016 06:29am
So, I have been borrowing an iPad for the last couple of weeks. I realize, given my critique of the device, that it might seem a bit bizarre for me to be using one. But I consider it research, a way to have an informed position, and since this is really one of our lab computers, I didn't have to purchase one. I have been trying to use it for everything I need a computer to do, forcing myself to use it over my laptop. What follows is my now informed, researched critique of the iPad.

My initial thought: I would pay $1000 for one of these tomorrow, but only if they unlocked the damn thing. This (I am typing on it now, more on this later) is perhaps the most frustrating computer experience I have ever had. Frustrating not because the iPad is difficult to use, it is anything but. Rather, it is frustrating because it is such an artificially, unnecessarily crippled device. Or, as I have said to those who have seen me carrying one around, "It's like being given a Ferrari, only to discover that it has been equipped with a VW Bug engine." The iPad looks nice, and shows what is possible, but only shows, never really delivers. As I posited earlier, this is an appliance, not a computer, but if this were open, running a full OS . . . In this respect I think David Pogue's schizophrenic review, one I initially thought was a little cheeky, too clever by half, is pretty much dead on. If you buy one wanting a computer you will be disappointed, but if you buy one wanting a device for consuming all your digital content, it is well worth the price tag. Consider it as a video game platform, an ebook reader, and a way to surf the web. Quite a bargain in some respects. (But, and I stress this, it is strictly a consumption-based device right now; you really have to fight it to use it to create and compose.)
On being an ebook reader: Initially I thought that the iPad, with its backlit screen, could not compete with the Kindle or Sony eReader and e-ink, but after using the iPad I think the difference is not all that large. I have read for several hours at a stretch on the iPad and it doesn’t produce the eye strain I am used to associating with screen reading (Instapaper was one of my favorite uses of the iPad). To be sure, e-ink is still better, but the difference is nowhere near as large as I expected. Add to that the much easier (theoretically) ability to annotate your reading, and I could foresee a future where I carry a slate-style computer around to do most of my reading, especially journal articles and student papers. Furthermore, the ability to do creative things, to think beyond the book, embed video, dynamically update, present nonlinearly, makes it promising. I downloaded one "instructional app," a statistics program that is a textbook plus quizzes etc., and it definitely points to a future for class content distribution that is much better than the current model. Plus I could carry around all my student papers, syllabi, and important documents in one small object. I do this already on the iPhone via Dropbox (minus the student paper part), but having them on a larger screen would make them far more usable. With an iPad I could truly go paperless. Interface: This is where the iPad really shines. The multi-touch screen interface changes the way you interact with a computer. Sitting down at a computer with a mouse and a keyboard just seems primitive now. The web surfing experience is so vastly superior. It’s honestly difficult to describe: the zoom-in, zoom-out, slide-objects-around tactile nature of viewing. The iPad begins to change not only the way you interact with the web, but what can be done in terms of design and presentation.
The best way to describe this is to think Minority Report (note to Apple: Minority Report serves as proof of prior art, so don’t be assholes and try to patent all of this). Very few applications have taken advantage of this yet, but the ones that harness the power of multi-touch really are a different sort of experience. I have been using iThought for mindmapping lately and there is a huge difference between clicking on a branch and moving it (as with Nova Mind or other desktop-based applications) and actually grabbing/touching the branch and moving it to where you want. The future is in touch screen interfaces, and I can’t wait for more of them. Keyboard: The keyboard is not bad; I can use it for most of my typing. I am still slower than on a laptop with a full keyboard, but getting better, and I am sure I could retrain myself given another month or so. I also think that a case which would prop it up a bit, or using the external keyboard, could help. Certainly the keyboard would not keep me from using this as my primary computer, especially if I kept a full-size keyboard at work for long composition, but I did write this whole blog post on the iPad. Battery life: Battery life is wicked good. I can easily go a whole day without charging it, more like two days. Data: This is where the iPad really sucks. There is no desktop, no place to store all of your data. For example, if you want to build a Keynote presentation (the Keynote app is horribly crippled, by the way; many of the features I am used to are not there) this can be incredibly frustrating. So you are in Keynote and you want a picture for your slide. You have to exit Keynote, go over to Safari, open it up, find the photo you want, copy it (if you want more than one you have to save them to iPhoto; if it is just one you can save it to the clipboard), close Safari, go to Keynote, and import the picture from the clipboard or iPhoto.
Now say you need to give credit for the photo: you have to close Keynote again, open back up Safari, copy the URL, close Safari, open Keynote back up, and then paste the URL into your credits slide. Seriously frustrating. I know the next release of the OS promises to allow multi-tasking, but the real issue here is not having a desktop to which you can save all the images, video, text, etc., you want. Or an open design platform so somebody could design me a clipboard with a 50-item cache. Applications for the most part can’t talk to each other and can’t pass data back and forth. So you have to develop all of these workarounds to have access to files. Right now the best way, I think, is through Dropbox, but your Keynote presentation can’t save to Dropbox; it can only save locally. So, you have to email it to yourself, and then from your home computer upload it to Dropbox. See, ridiculous, frustrating. Locked Out: The above is really a problem because of the way the iPad is locked down; you can only have apps which Apple wants you to have (can we talk about the fact that Apple denied a cartoonist's application because it might be offensive? do we really want one company wielding that kind of media influence?). I get what Steven Johnson is saying, that the device can be seen as generative, that the app store provides a certain amount of stability and funding guarantee for developers. So what we have seen is an incredible explosion of iPhone apps, which is likely to be reproduced on the iPad. The problem with Johnson’s argument is that an open system is not mutually exclusive with an app store. Apple could provide an app store for the iPad, one with safe, approved apps, and still allow others to install apps they didn’t get from the app store. This is how iTunes music works. You can buy songs from Apple, from Amazon, or upload your own, all of which iTunes can handle. Apple as a large media conglomerate, hardware and software distributor, scares me.
How many people would leave their Apples behind if Jobs went to an App Store model for laptops and desktops? Many of my favorite Mac apps are ones that probably would not have gotten approved. What developers will build for the iPad will no doubt be amazing, and this will probably continue to drive popularity for some time, but developers might also start to balk at Apple's tight control. I really want to see what developers could do if they had root access to this thing; my guess is it would be pretty f’in amazing. What’s Next: I won’t be buying one. I am going to wait. Having said that, I think if I were a developer or teaching web design directly I would. Why? Because it really changes the way you can compute, and having a device that provokes this type of thinking is useful, a device that points to the future. But I still stand by the claim that I wouldn’t want these for my students as their computing devices. I would hate to see what type of student would develop if this were their only or primary means of computing. Instead I am holding out hope that the competitors will quickly get an open one to market. As for me, I am going to go learn Android so that when a slate running Android gets to market I’ll be ready to use it as my primary device.
David Parry   .   Blog   .   <span class='date ' tip=''><i class='icon-time'></i>&nbsp;Jun 09, 2016 06:28am</span>
Yesterday, Dan Cohen’s tweet about the iPad and censorship got me thinking about a drawback to the iPad-for-education argument. What Dan made me wonder/realize is that by using iPads for educational purposes, schools, both higher ed and secondary/primary ed, would be opening themselves up to censorship by Apple. In other words, as I tweeted this morning: Consider that Apple’s track record here is not all that great. The way the App Store is currently administered, applications have to receive approval from Apple to be listed. Now supposedly this was initially done for quality assurance purposes (to make sure apps won’t crash your device) and in limited cases to ensure that apps don’t duplicate existing core apps (listening to music, email) or interfere with AT&T's financial interests. But as the App Store developed, Apple extended its approval process into the role of censorship. From Apple’s Program License Agreement: "Applications may be rejected if they contain content or materials of any kind (text, graphics, images, photographs, sounds, etc.) that in Apple’s reasonable judgement may be found objectionable, for example, materials that may be considered obscene, pornographic, or defamatory." So, Apple might block anything that in their reasonable judgement they think is "obscene, pornographic, or defamatory." This, as far as I am concerned, is a dangerous situation: Apple as moral censor. Now certainly it is within their legal rights to do so, but the question is whether or not it is a good idea for us to enter this contract (and by us I mean both users and developers). Most famously this restriction affected developers of "pornographic content," with Wobble being one of the more hyped, removed, reinstated apps. This also means that the range of iPhone sex apps must use stick figures rather than more illustrative pictures.
So, say for instance you are teaching a course on human sexuality, or a sex education course: is Apple going to restrict what you can and can’t do with the iPad, content-wise? Okay, you might be thinking this is a liminal case; teaching sex in schools is always a touchy subject and Apple will necessarily be treading on shaky ground here. I think most people probably feel no threat from Apple as long as they limit their censorship to "pornographic content," but as their policy indicates, it extends further than that. There is political content that Apple not only would be willing to censor, but has already censored. (Worried yet?) The most famous case of political censorship by Apple so far is that of Mark Fiore, who won the Pulitzer Prize for his political cartoons. His app was censored by Apple. Upon his winning the Pulitzer, his app was subsequently made available, but a situation where someone has to win a major award to overcome Apple’s censorship doesn’t exactly strike me as conducive to intellectual discourse. Now consider the possible futures. Will Apple censor political apps that one might want to use in one's classroom? What happens when Apple goes international with the iPad-in-education movement? Will the German laws restricting what can and can’t be said about Nazism limit what content Apple makes available? What about in China? Currently this is not an issue because the devices we use are independent from the content (or at least with respect to most computers); the company doesn’t get a say in how I use their device. This initially might not seem like a big concern, for as many people pointed out on Twitter today, Apple is not going to censor documents that one accesses on the iPad; Apple only restricts what applications you can run on their devices. So presumably one could buy an ebook reader app for the iPad and run any textbook that is published in ePub through the reader, and Apple will have no say in the matter.
But as Dan’s tweet points out, this is a concern. In the first place, many books are published as apps, so they will not get a workaround. This is especially true of textbooks, which are likely to be published as apps requiring updates every year, following the software leasing model rather than the purchase-a-song model (textbook industries will love this as it yields greater revenue). And many of the educational materials people use will be "rich textbooks," not just ebooks but packaged content with videos, quizzes, and "interactive content," so just publishing to .epub or .pdf won’t constitute a workaround. Imagine the scenario where you want to include this M.I.A. video in your course content about police state violence and racial profiling. (YouTube already removed this video, so it is not too far-fetched to imagine Apple would deem it too violent.) But take this even a step further, beyond "bookish" content: there is a range of material that I would want to make available to my class which Apple might choose to ban (and I am not even talking about the illegal stuff here). Consider: I have taught (and probably will continue to teach) Super Columbine Massacre RPG! Clearly this is content someone might find "obscene" or "defamatory"; how do I know what Apple’s judgement on this is going to be? Is this really a decision I want to turn over to Apple? Indeed, by allowing a locked-down device into the classroom, especially if one makes it the centerpiece of a technology-in-the-classroom movement, this is precisely what will happen. Apple will have control over what type of content students can place on these devices. I realize, as many pointed out on Twitter, that this is a decision many school boards already make, censoring course material; believe me, I live in Texas, I get it. But there is something substantially different about a community deciding what is and is not appropriate for its students, and a corporation making these decisions.
And for higher ed, where we are not subject to the same school board politics, this would certainly mean accepting a larger set of restrictions than we are used to. Again, having one corporation serve as a media hub for software, hardware, and now content strikes me as a future we ought to resist. (I need a "Just Say No to iPad in Education" banner.)
David Parry   .   Blog   .   <span class='date ' tip=''><i class='icon-time'></i>&nbsp;Jun 09, 2016 06:27am</span>
The following is a summary of my talk, or more accurately, the short written version of my talk, "Burn the Boats," which I gave a little over a month ago at the DWRL in Austin. You can read the post, or skip to the end and watch the videos (which last about 40 minutes) and get the longer form of the argument. Earlier this year Marc Andreessen was interviewed by TechCrunch on the future of publishing, in particular journalism. Andreessen’s response was, provocatively, "Burn the Boats." What he was referring to was the moment Cortez, fleeing from Cuba and landing in Mexico, ordered his troops to "burn the boats," preventing any possibility of return. The lesson: don’t defend lost ground; at times there is no going back, and making decisions to ensure that one does not consider a return is a good move. Andreessen’s point was that old print-based media forms are dead, and it does no good to try to re-envision them for the 21st century; rather, journalism institutions need to boldly move to future web-based models, giving up their print-based biases. In a similar regard I would like to suggest that academics "Burn Their Boats," or in this case, more specifically, "Burn the Books," making a definitive move to embrace new modes of scholarship enabled by web-based communication, rather than attempting to port old models into the new register. Rather than giving the book a digital facelift for 21st-century scholarly communications, academics should move past the book-based biases which structure scholarly communications and instead imagine and execute born-digital scholarly forms which leverage the evolving digital media landscape. Let me be clear: I like books. In fact it was my love of books, or more specifically my investment in what books can accomplish, that led me to graduate school. My PhD is in English, after all.
Indeed I collect books, and although I don’t do it much anymore, I have at times spent time tracking down and acquiring first editions of some of my favorite works. I am not in fact suggesting that we actually engage in book burning, nothing of the sort, although if I did actually burn some of my books I think it would make moving easier. Instead I am suggesting that we burn our love affair with books, and that out of reverence for the book, we stop treating it as the only, or indeed primary, means of scholarly communication. Not only are there better ways, but if academia wants to remain (or more skeptically, become) relevant, we ought to recognize that the book is no longer the main mode of knowledge transmission. Faced with the transformation to the digital, the newspaper industry chose to protect a business model instead of preserving its social function. My fear is that academics are making the same mistake. Now granted, this analogy is not perfect; there are contours and shapes, and nuances and details that matter here. The two are not directly equivalent, but I think the underlying logic is the same. It concerns me that academics and intellectuals, with some exceptions, seem to be repeating this mistake, following the digital facelift model, asking how they can continue to do what they do now, but do it in the digital space, rather than asking how what they do has been fundamentally changed in the age of the digital networked archive. Administrators have a tendency to preserve the business function (how can we offer our classes online vs. how does the online reshape the very idea of a class), and academics end up defending the political and ideological function (the importance of books and peer review). It is probably worth distinguishing here between the materiality of the book and the ideologies and biases we associate with the book. That is, at the most basic level a book is a dead tree processed and bound together in leaves of paper and stained with ink.
But many of the things that we have come to associate with the book are not in fact coterminous with its material structure, but rather biases developed over the Gutenberg Parenthesis. I won’t fully develop this idea here, but this is what I often call librocentricism, or a book-biased way of thinking, where the book stands in for certain prejudices and ideas about knowledge. As a way of thinking about this, notice how the word book often stands in for, or comes to mean, the entirety of the matter, as in The Book of Nature, to "throw the book at someone," or The Book of Love. Again, there is a lot more to this idea, and I would no doubt need more than a blog post to develop it, but I think it is easy to recognize, even if the full complexity of the argument would take time, that "book" comes to be an epistemological framework for knowledge, not just a material one. One quick example of how this works, before I move to some ideas for restructuring scholarship: syllabi. A syllabus is often structured like a book, with a beginning, a middle, and an end, indeed even with chapters (sections), where the traversal (completion of the weeks or reading of all the pages) promises to deliver the knowledge product. The idea that knowledge is a product which can be delivered in an analog vehicle is precisely what I want to call into question. What the network shows us is that many of our views of information were/are based on librocentric biases. If you printed out all the information on the net, roughly 500 billion GB, it would stack from here to Pluto 10 times. While the book treats information as something scarce, the net shows us precisely the opposite: information is anything but scarce.
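For the curious, the print-the-internet claim can be sanity-checked with a quick back-of-envelope calculation. The only figure taken from the post is the 500 billion GB estimate; the bytes-per-page, paper thickness, and Earth-to-Pluto distance below are my own assumed round numbers, so treat the result as an order-of-magnitude sketch, not a precise figure:

```python
# Back-of-envelope check of the "printed internet stacks to Pluto" claim.
# Only NET_BYTES comes from the post; the other constants are assumptions.

NET_BYTES = 500e9 * 1e9      # 500 billion GB, expressed in bytes
BYTES_PER_PAGE = 2_000       # ~2 KB of plain text per printed page (assumed)
PAGE_THICKNESS_M = 1e-4      # ~0.1 mm per sheet of office paper (assumed)
PLUTO_DISTANCE_M = 5.9e12    # ~39.5 AU, average Earth-Pluto distance (assumed)

pages = NET_BYTES / BYTES_PER_PAGE
stack_height_m = pages * PAGE_THICKNESS_M
pluto_trips = stack_height_m / PLUTO_DISTANCE_M

print(f"{pages:.1e} pages -> stack of {stack_height_m:.1e} m "
      f"= {pluto_trips:.1f} Pluto distances")
```

Under these assumptions the stack reaches Pluto a handful of times, the same order of magnitude as the "10 times" in the post; tweaking the assumed page density moves the multiplier around, but the point stands either way: the quantity of information is astronomically beyond anything paper can hold.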
Books tell us that one learns by acquiring information, something which is purchased and traded as a commodity, consumed and mastered, but the net shows us that knowledge is actually about navigating, creating, participating (to be sure, some people still trade in knowledge, buying and selling secrets, but this is of a substantially different order than the work we as academics do, especially humanities-based academics). Knowledge is no longer print-based, nor governed by the substrate of paper. Indeed, while in many ways we might continue to harbor librocentric biases, as we move away from structuring knowledge to end up on paper, these framing structures will prove less and less necessary, and may actually impede our ability to participate in knowledge conversations. I am not saying that we should wholesale give up on books, actually perform a book burning, freeing ourselves from all of the pages we have in our respective offices, but rather something slightly different: we should start conceiving of our scholarship as if it will not end up in books. Indeed it still might, but begin by asking what scholarship would look like if it were not designed to end up in books. Here are some ideas, or suggestions, for this changeover: Stop Publishing in Closed Systems: If I can only convince you of one thing, I hope it will be this. If you publish in a journal which charges for access, you are not published, you are private-ed. To publish means to make public; if something is locked down behind a firewall where someone needs a subscription to view it, it is not part of the "common knowledge" base and thus might as well not exist. Academic journals are treating knowledge as if it is a scarce commodity; it is not, so don’t let them treat it as such. If someone wants to publish something you wrote, ask them if you can keep the copyright and license under Creative Commons, and if they say no, don’t give it to them, and find someone who will.
Look for journals which publish only online and only for free. Self-Publish: Publishing and editing are hacks based on the scarcity of paper; there is no need to carry them over to the new medium. Once, publishing was the most efficient way to reach the largest audience; that is no longer the case, so let’s get over our publishing fetish. Publishing online allows you to engage a wider audience, both faster and more efficiently than any print-based journal. We think of an academic's role as presenting polished, finished work and ideas, but this need not be the case. We should switch to presenting our ideas in process, showing our work, not just the final product. End the .pdf Madness: A .pdf document is not a web-based document; it is a print-based document distributed on the web. One of the principal advantages of the web is the way it connects, operating as a network of connections within an ecosystem of knowledge: one can search, copy, paste, edit, and link with ease, none of which is true of a .pdf. The .pdf is just a way of maintaining print-based aesthetics and structures on the web. In the same way you wouldn’t think of publishing a book without the appropriate footnotes, don’t publish to the web without the appropriate live links. Get Over Peer Review: Peer review is another hack based on the scarcity of paper. Given the cost of producing knowledge, and the fact that academic journals or academic presses could only afford to produce so many pages with each issue, peers were established to vet, and signal, that a particular piece is credible and more worthy than the others. This is the filter-then-publish model. But the net actually works in reverse, publish then filter, involving a wider range of people in the discursive production. Some of the most productive feedback I receive on my work comes not from peers, who have a rather narrow sense of what counts and what doesn’t, but from a wider range of people, with diverse perspectives.
Why do academics argue for small-panel anonymous peer review? One thing we know is that diversity of perspective enriches discourse. Aspire to Be a Curator: I think we have to give up being authorities, controlling our discourse, seeing ourselves as experts who possess bodies of knowledge over which we have mastery. Instead I think we have to start thinking of what we do as participating in a conversation, an ongoing process of knowledge formation. What if we thought of academics as curators, or janitors: people who keep things up to date, clean, host, point, and aggregate knowledge, rather than just those who are responsible for producing new stuff? Do we really need another book arguing that throughout the history of literary scholarship the important field of ‘x’ has long been ignored? No. But we could actually use some really good online resources and aggregators for particular knowledge areas. Think Beyond the Book: Think of the book as one form, not the form. Indeed, think of things that move beyond the book. What if your writing didn’t have to be stable, didn’t have to have a final version? What if you could constantly update, change, alter, and make available your work? There would be no final copy, just the most recent version. While the constantly-in-beta mode might concern those who aim for perfection, it can also be liberating when you realize that nothing is fixed, taking advantage of the fluid. What happens when we give up on, or at least refuse to be limited by, librocentricism? What if a piece didn’t have to be 20 pages for a journal article or 250 for a book? There are economic constraints that place limits on the size of academic writing; how much better can we be when we get rid of these? Or what would an academic argument as an iPhone app look like? Let me be clear, I am not saying that the book is dead; in one regard it is already dead, in another it continues to haunt us and will never die.
And we should be glad for this haunting; there are many features of the book from which we benefit. What I am saying, though, is that the centrality of the book is gone, and academia would do well to recognize this, to move in new directions, onto new grounds, where many already are. We should not continue to constrain our thinking by this librocentricism which no longer structures or limits the way that knowledge is produced, archived, or disseminated. (P.S. Below is a photo I took on my visit to The Chronicle last week. Apparently these are all the books they received from academic publishers in the last week (that’s right, just one week) which nobody wanted. In other words, at an academic institution like The Chronicle, not one reader could be found for any of these books. They were giving them away for free. Seriously, we should stop this madness. Won’t somebody please think of the trees?) (Below is the full video where I elaborate on the points/ideas above.) Burn the Boats/Books, part 1 from DWRL on Vimeo. Burn the Boats/Books, part 2 from DWRL on Vimeo.
David Parry   .   Blog   .   <span class='date ' tip=''><i class='icon-time'></i>&nbsp;Jun 09, 2016 06:26am</span>
Harrisburg University seems to be getting a small amount of press lately for announcing that, as an experiment, it would block all social media websites for a week (Inside Higher Ed article, Chronicle article). Facebook, Twitter, MySpace, even AIM and chat features on Moodle will be unavailable on the University network (or more precisely, the campus will block the IP addresses of most social networking services and turn off these features in its own software). In general I think it can be a productive activity to encourage students to take a step back from relying on social media. I say this not because I think social media is a bad, or even harmful, technology, but rather because I think that changing behavior can lead students to certain realizations about whatever it is they are studying. Showing students is usually a better pedagogical method than telling them. I won’t go into all the reasons in detail here; if you want you can check out the longer article I wrote for Flowtv.org on the student saturated media environment, but in short I would say that what seems strange and unfamiliar to us is normal to most of our students. That is, there is nothing particularly strange or unusual to them about Facebook, texting, Twitter, YouTube, etc. As an educator, one particularly effective tactic, I think, is to take the familiar and make it look strange. Or as Siva Vaidhyanathan explained on Twitter recently, students are like fish swimming in an ocean of media; my job is to get them to notice the water. So it might seem like I would support Eric Darr, the provost of Harrisburg, and his plan to cut off social media for a week. Except I don’t. Actually I think it is a bad idea (maybe with good intentions, but a bad idea nonetheless). Let me explain. In short, I think this sort of experiment needs to be done carefully at a local level, not globally with a broad brush.
As Eric Stoller characterized the decision, having the Provost decide the matter for the whole University seems a bit "heavy handed." (Note: the "heavy handed" quote which is attributed to me in the Chronicle article originates with Eric, although I agree with it.) In this instance it becomes an abstracted authority telling his subordinates what is and is not healthy, or at the least creating an experiment in which the participants have no say. Whether or not it is Eric’s intent, the message easily becomes "students cannot live without social media, they should try it for a week." And again, whether or not this is the Provost’s intent, it ends up coming off like a "kids these days" situation. Try substituting another "batch" of technology to see how problematic this becomes. For a substantial portion of the faculty, dissertations were written on a typewriter: maybe we should ban all computers for a week and make graduate students work on typewriters. Or, we used to communicate in handwritten letters: for a week all communication must be handwritten. Or, people used to walk everywhere before there were cars: maybe we should have students practice a car-free week. This is not to suggest that any one of the above couldn’t be a productive project, but I think they would only be productive given the right context. If you were studying urban planning it might be useful to have students not use cars for a week, or if you were studying linguistics and machine technology maybe letter writing would be appropriate, but without a context I think the experiment is bound to fail, probably creating more frustration and anger than anything else. In essence Harrisburg (or Eric, it’s difficult to tell) has grouped together a wide range of technologies and banned them all without really recognizing their differences, and recognizing the differences between these technologies is one of the crucial things we should be teaching.
On the first level, who decides what is "social media" and what is not? Is foursquare blocked? What about last.fm? World of Warcraft? Discussion boards? Heck, even blogs with comments? I am not sure that I could decide what is and what is not social media, and I am supposed to be an expert in it; how is a school going to decide? Second, on the practical level, it is near impossible to block all social media sites. Even if you could create a working definition of social media, it would be impossible to create an exhaustive list of sites; there are simply too many to count. Furthermore, how does one even go about enforcing this? A University-wide ban is not likely to stop students from using social media; rather, what it is likely to do is teach students how to set up proxies and route around the IP blocking the University is planning on doing (not that this wouldn’t in and of itself be a good thing for students to learn; I wonder how many Tor downloads will happen that week?). Or students will likely just go off campus to access the net, making the ban an inconvenience but not an experience of giving up social media. What’s more, it is likely to disproportionately affect students over faculty, and disproportionately affect some students more than others. Faculty members who go home at night, or students who live off campus, will be less affected. And what is worse, there is likely to be a class divide here, as students who can afford to work at places like coffee shops will access the net there, or students who can afford smartphones will just rely on those devices for social networking. There is one other concern here worth noting, one that I tried to raise in The Chronicle article but which unfortunately came across probably too softly. I think we should start by recognizing that social media isn’t an online form of communication; rather, social media is how students communicate.
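The definitional and enforcement problems above can be made concrete with a toy sketch. Suppose the campus filter amounts to checking hostnames against a denylist (the site list, helper name, and example hostnames below are all hypothetical illustrations, not Harrisburg's actual setup):

```python
# Toy sketch of a denylist filter like the one a campus network might run.
# BLOCKED_SITES and all hostnames below are illustrative, not a real config.

BLOCKED_SITES = {"facebook.com", "twitter.com", "myspace.com"}

def is_blocked(hostname: str) -> bool:
    """Return True if the host, or any parent domain, is on the denylist."""
    parts = hostname.lower().rstrip(".").split(".")
    # Checks "www.facebook.com", then "facebook.com", then "com".
    return any(".".join(parts[i:]) in BLOCKED_SITES for i in range(len(parts)))

# The listed sites are caught...
assert is_blocked("www.facebook.com")
# ...but the list can never be exhaustive: unlisted services sail through,
assert not is_blocked("last.fm")
assert not is_blocked("some-discussion-board.example")
# and any off-list proxy hostname defeats the block entirely.
assert not is_blocked("my-proxy.example.org")
```

The sketch makes both halves of the argument visible: someone has to keep deciding what belongs in `BLOCKED_SITES` (the definitional problem), and anything not on the list, including a proxy or Tor entry point, passes untouched (the enforcement problem).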
In other words, Eric isn’t asking students to give up communicating online; he is asking them to give up a large portion of the way in which they communicate. Imagine if the experiment were to have no one on campus talk to each other. There are actually fairly serious concerns here that shouldn’t be treated lightly. For many students, their social media networks of friends are crucial to their daily lives, whether as the primary means by which they stay in touch with people or, at the most significant level, as a medium by which they connect with their support groups. Asking students to give up social media is not just a technical ask; it is a social and psychological one as well, one which I think those who don’t use it as a primary means of communicating probably underestimate. But it is all too easy to critique without offering a solution. So, here is my solution, how I go about asking students to go on a social media fast. I do it within a specific class context, making it an assignment. Since I teach social media, media is both the object and means of study, so any ask I make is within the context of the class. In the same way that asking students to give up cars for an urban planning class would make sense, asking students to give up a particular social media site within the context of a class makes sense. This also presents the opportunity to discuss and process the experience. Create buy-in. Just telling students to live without social media seems too authoritarian; explaining it to them, again within the context of the class, is a far more effective way to handle the situation. If students are bought into the assignment then they are more likely to do it. An assignment like this cannot possibly be monitored, so you need students to want to do it willingly. Do all my students follow through? No, but a majority do. (Incidentally, the person who commented on The Chronicle that I would leave it up to a class vote sort of missed this point.
You can demand a lot of things from students; the one thing you can’t demand is that they learn. Their mindset going into any assignment will greatly determine what they get out of it.)

Make the assignment after, or while, studying the object. This again creates context. After discussing Facebook and the way students use it, asking them to give it up for a week will make more sense.

Pick specific social media, not all social media. When I assign students to give up Facebook for a week they are still free to use email, discussion boards, even Twitter. By being specific you get students to pay attention to the specifics of each site rather than treating them all as equal, which they clearly are not. I might have students give up search engines for a day next semester.

Have a specific timeline and a reason for the duration. Make it a challenge.

Recognize that students will be differently affected by this assignment, especially if you are asking them to give up their support networks.

Join them. I never ask students to give up something that I am not also willing to give up.

Have them write about it, during and after. I want them to process the experience; they learn more this way, and learn more from each other this way.

P.S. You should also read Eric Stoller’s take on this from a student life perspective.
David Parry . Blog . Jun 09, 2016 06:25am
My most recent pedagogical obsession is not, as you might think, social media fasts, but rather working out ways to create effective group projects. Honestly, I consider this one of my serious shortcomings as a professor. I have yet to create a group project with whose results both the students and I were happy. Something always goes wrong. This is not to say that there haven’t been good ones (and some total misfires), but I have yet to really figure out the best way to do it. Part of my problem comes from not having had this modeled for me in graduate school (we in the humanities are more accustomed to working solo), coupled with my own few past experiences as a student, in which I greatly disliked working in groups. But beyond that, I think it is a substantial problem with both the way institutions are designed and with student expectations. It is hard to evaluate students individually (what the institution requires) while trying to hold the whole group accountable. And I struggle with this, because I want to encourage and evaluate students for who they are, but on the other hand I see it as part of my job to teach students how to work in groups. I think most of the kinds of work environments they are likely to end up in will require working in groups, and internet projects, due to their complexity, require groups.

So here is what I am trying this semester for my EMAC 4325, Privacy, Surveillance, and Control on the Internet. The focus of the class is on semester-long research projects where each group has a public website/blog covering one aspect of the class. So for the whole semester groups have to work together to produce their project. The project is designed to require a range of skills: design, writing, coding, image manipulation, video and audio editing, etc. I came up with two basic rules for this project: Everybody in the group gets the same project grade (which is 50% of the final grade). If you are unhappy with a member of your group, i.e.
feel that they are not sufficiently contributing, you can fire them from your team. I put together these two rules from different projects I saw others do, although neither project put them together. On the first day of class I explained these rules and then handed out the long, detailed sheet which contained all the information on the project. Part of the project, indeed the first thing they had to do, was to come up with community rules which described how the group was going to function, what the initial responsibilities would be, and finally the means by which they could dismiss a member of the group. In other words, they had to write a group constitution of sorts, complete with the reasons and methods by which they would dismiss someone. (I did explain that in every case a meeting with me would be necessary, but I did this mainly as a way to make sure the group rules were followed; if a group decides to remove someone, then I plan to support them.) If someone is removed from a group then they become a group of one, responsible for their own project (which frankly is quite a bit of work).

Do I think this will solve all of the group assignment problems? No. But I think this probably represents more realistically how groups function outside of academia: they succeed or fail as a group. It doesn’t really matter if you work really hard, harder than anyone else around; you still need the group (ask LeBron James about this). By focusing on the group I won’t get caught trying to figure out team dynamics and what went wrong, assigning blame (like Restaurant Wars on Top Chef); instead everyone succeeds, or everyone fails. Simple . . . hopefully.

The next thing I did was get them divided into groups. This was actually the most difficult part of the class, so far.
I wanted students to have a say in what group they joined, so that they were working on a topic that interested them, but I also wanted to avoid people just pairing up with people with whom they had worked before and are friends. I also wanted to make sure that each group got a diversity of talent. I contemplated having them pick teams (schoolyard style) but thought that would end up being a bit ridiculous and isolating for the people who were not picked. Instead, I had each student write their name on one side of a notecard; on the other side they wrote the three topics that interested them the most, and then the three skills they would bring to the project, creating anonymous mini-resumes. I then selected one person for each group, and subsequently that person got to pick one person for their team from the notecards. On the whole this worked out: everyone got into a group that interested them, the talent in every group is pretty diverse, and groups were picked based on talent, not prior relationships or popularity. Overall, three weeks into the semester, I am happy with how the groups are progressing. I have started to give them weekly feedback, always directed at the group rather than at individuals. You can see the complete details of the project at the class website, along with links to all the ongoing projects. I’ll write about this again at the end of the semester . . .
David Parry . Blog . Jun 09, 2016 06:24am
As many of you know, The University at Albany (the place from which I received my PhD) has decided to close its French, Italian, and Russian departments. There is a range of reasons that make this an uninformed decision; for a rundown see Rosemary Feal’s The World Beyond Reach. More entertainingly, though, Jean-Luc Nancy, Professor of Philosophy at the University of Strasbourg and the European Graduate School, has written a rather snarky critique of Albany that pretty much sums up what is at stake here. Since it is short I have reposted the entire response (with permission):

Choisir entre supprimer le français et supprimer la philosophie… Quel beau choix ! Enlever plutôt le foie ou le poumon ? Plutôt l’estomac ou le coeur ? Plutôt les yeux ou les oreilles ? Il faudrait inventer un enseignement strictement monolingue d’une part - car tout peut être traduit en anglais, n’est-ce pas ? - et strictement dépourvu de toute interrogation (par exemple sur ce qu’implique la "traduction" en général et en particulier de telle langue à telle autre). Une seule langue débarrassée des parasites de la réflexion serait une belle matière universitaire, lisse, harmonieuse, aisée à soumettre aux contrôles d’acquisition. Il faut donc proposer de supprimer l’un et l’autre, le français et la philosophie. Et tout ce qui pourrait s’en approcher, comme le latin ou la psychanalyse, l’italien, l’espagnol ou la théorie littéraire, le russe ou l’histoire. Peut-être serait-il judicieux d’introduire à la place, et de manière obligatoire, quelques langages informatiques (comme java) et aussi le chinois commercial et le hindi technologique, du moins avant que ces langues soient complètement transcrites en anglais. A moins que n’arrive l’inverse. De toutes façons, enseignons ce qui s’affiche sur nos panneaux publicitaires et sur les moniteurs des places boursières. Rien d’autre ! Courage, camarades, un monde nouveau va naître !
Jean-Luc Nancy, professeur émérite d’une vieille Université française (pas pour longtemps).

What’s that you say? You can’t read it because it’s in French? Well, luckily for you, despite the best efforts of the Albany administration, there are still French Studies faculty left. So you can read it in translation:

To choose between eliminating French or Philosophy . . . what a fabulous choice! Should one rather take out the liver or the lung? The stomach or the heart? The eyes or ears? We need to invent teaching that is, on the one hand, strictly monolingual - for isn’t it true that everything can be translated into English? - and strictly lacking in all forms of questioning (for example concerning what is implied by "translation" in general and from one language to another in particular). A single language unencumbered by the static [parasites] of reflection would be a great subject for university study, smooth, harmonious, easily submitting to the controls of acquisition. We should propose eliminating both of them, French and Philosophy. And everything existing in proximity to them, like Latin or psychoanalysis, Italian, Spanish or literary theory, Russian or History. Perhaps it would be wise to introduce in their place, as requirements, certain computer languages (like Java), as well as commercial Chinese and technological Hindi, at least until such languages are able to be completely transcribed into English. Unless the inverse were to happen first. In any case, let’s teach what is displayed on our advertising billboards and on the stock exchange monitors. That and nothing else! Courage, comrades, a new world is about to be born!

Jean-Luc Nancy, Emeritus Professor of an old French (not for long) university. Translation by Professor of French and English David Wills (fair disclosure: David was my dissertation director).
David Parry . Blog . Jun 09, 2016 06:23am
Sorry, folks, not much here as of late. That is because I have been working on another project. At any rate, for those who are interested: on Monday at noon East Coast time, I will be participating in a webinar on teaching Writing as Information Arts (sort of a way of thinking about teaching digital literacy).
David Parry . Blog . Jun 09, 2016 06:23am