Blogs
"Innovation Begins Here" free ebook from @briansolis http://t.co/GkWITVqkLa Excellent! HT @sai_iowa #iaedfuture #plaea
Tags: iaedfuture, plaea
Jim Gates | Blog | Jun 09, 2016 07:03am
I'd like to hear from educators thinking about starting a business. http://t.co/cYFI6qJoPn #ukedchat #cpchat #edadmin #edchat #educhat
Tags: ukedchat, cpchat, edadmin, edchat, educhat
Jim Gates | Blog | Jun 09, 2016 07:03am
Must read --> Star Wars, The Cool Kids, & Ridicule by @michellek107 #bcedchat #think35 https://t.co/A5HW9h7SGU
Tags: bcedchat, think35
Jim Gates | Blog | Jun 09, 2016 07:03am
Spoiler Alert… https://t.co/eHsdAyGyQl
Jim Gates | Blog | Jun 09, 2016 07:02am
Math Magic. NEW VIDEO --> https://t.co/X8u8Jwhp6N
Jim Gates | Blog | Jun 09, 2016 07:02am
Annotations:
and higher technology
Jim Gates | Blog | Jun 09, 2016 07:02am
Yesterday @SybilV posted a comment via Twitter during a library orientation for her class:
An innocent enough gesture, one could assume. What I took Sybil’s point to be was that Britannica is not a good scholarly source, and that the library should be encouraging other, more appropriate research practices (like, you know, using scholarly sources and judging credibility and bias). But what also struck me about this was the odd moment when librarians are encouraging students to use the encyclopedia as a source. And, perhaps I read too much into this, but I think the librarians’ gesture comes as a correction to Wikipedia, i.e. the subtext here is "Don’t use Wikipedia, use Britannica." This might be my bias, or my way of reading things, so, fair enough, I didn’t respond to Sybil’s tweet. But apparently Britannica has a Twitter account, and the person who manages the account noticed Sybil’s tweet and decided to respond:
Shocked to see that Britannica was on Twitter, I couldn’t resist and posted the following:
Well, needless to say, it was all downhill (or shits and giggles, depending on your perspective) from there. I won’t recount the blow-by-blow, mainly because it gets really long, and the person who tweets from @Britannica obviously feels passionate about defending Britannica; at one point they posted nine straight tweets defending the appropriateness of Britannica as a scholarly source.
A few notes might be worth making at this point: 1. I am not speaking for @SybilV here; these are my opinions, and I have a sense that my tone, if not also my stance, is more radical/contentious than hers. 2. I have no idea if the account @Britannica is an official Britannica Twitter account. I looked at the Britannica page and couldn’t find it listed. So the account might just be a Britannica fan, or an employee who unofficially tweets from that account. I don’t know, but I think we can take the arguments that @Britannica makes as indicative of those who champion this encyclopedia and its format.
It seems to me that with all the tweets sent back and forth, with others in the Twitterverse adding to the discussion, the central issue was "What is the appropriate use/role for Britannica in relation to society and specifically academia?"
So here’s the thing: 1. It has none. 2. This is because of Wikipedia.
Don’t get me wrong, I am not disparaging Britannica, not really. It had a role, and generally speaking it served it well, but:
Yes, Britannica is a pretty good secondary source, with a lot of advantages: articles are fairly thorough, contain citations, and are more or less accurate. But as a secondary source it doesn’t even come close to the value of something like Wikipedia. Thirty years ago, heck even ten years ago, Britannica was arguably the best secondary source around. If you wanted a quick overview of a specific subject, Britannica was a good place to start, a good portal to gaining deeper knowledge about a subject.
In a world of dead-tree-based knowledge, a centrally authorized, hierarchically controlled way of organizing information was a good thing. When you only have so many pages, can’t reprint frequently, and distribution is expensive, these are good decisions. But in a digital, networked information structure they are not.
What you want from a secondary source is a good introduction to a concept that is mostly reliable and up-to-date, entries for as many topics as possible, connections to where to go to learn more, and access that is as easy and ubiquitous as possible. A secondary source is not an in-depth analysis that makes one an expert on a topic upon reading it; it’s not designed to be. It is just a good overview. No secondary source is going to be completely accurate, or engage in the level of detail and nuance which we want from students, or that is required to fully "know" a subject.
This is why the banning of Wikipedia by schools and professors has always struck me as a particularly stupid policy. The issue is not that Wikipedia is or is not reliable and thus should be banned in academic environments; rather, the issue is that Wikipedia is a secondary source and thus should not be treated as a primary one. But this also holds true for Britannica. Any syllabus which contains language about banning Wikipedia misses this point. Ban secondary sources from student work, not Wikipedia in particular, as this confuses the issue. This doesn’t mean that students shouldn’t use secondary sources; indeed they should, for they are great ways to begin to learn about a subject. It just means they should not cite secondary sources, they should always look for primary ones, and they should never take Wikipedia or Britannica as the final word on a subject. I don’t recall a single syllabus from my college days (pre-Wikipedia) that said "do not use Britannica as a source for your papers; doing so will result in failing the assignment." Seriously, professors explained to us what reference books were for, and how to correctly use them.
Several semesters ago I wrote a piece defending Wikipedia and arguing that it was irresponsible not to teach students how to use Wikipedia. I won’t rehash those arguments here, but I will reference one objection made in the comments of this article, which I often hear when I talk about Wikipedia:
MY guess is that the author wouldn’t want his doctor to base his latest surgery on a Wikipedia article.
Of course not, don’t be stupid. I wouldn’t want my doctor to be educated by Wikipedia, but I wouldn’t want my doctor to be educated by Britannica either. The role of Wikipedia isn’t to train heart surgeons how to perform a bypass, nor is it the role of Britannica; that is not the function of these objects. To hold Wikipedia to this standard is more than a bit ridiculous. Wikipedia doesn’t strive to be an object that teaches doctors how to operate (although it seems that Britannica might be trying to claim this ground).
We could argue about the accuracy of Wikipedia (although studies show that it is as accurate as Britannica), or about the policy that "anyone can edit" (at least with Wikipedia I can view the editing history), or we could argue about the problems on Wikipedia, of which there are many (bland prose, serious debates between inclusionists and deletionists, its Western-English bias, an increasingly bureaucratic control structure, among others). But what really isn’t arguable at this point is that as a broad overview of knowledge, a good place to start an inquiry, Wikipedia is a killer app.
When it comes to functioning as a secondary source, a reference guide, Wikipedia has substantial advantages over any prior encyclopedia model. In the same way that Britannica’s model of "get experts in a field to write specific articles" was a vast improvement over the prior model of "get the smartest person to write the whole encyclopedia," Wikipedia is a substantial improvement over Britannica. (Sorry, folks at Britannica, this is just the way it is. P.S. While you are at it you might want to sell your stock in 8-tracks, newspapers, and scriptoriums.) The breadth of knowledge, its ability to be linked to other knowledge, its cost (free), its up-to-dateness, and its preservation of editorial discussions (it records not only the article but the discussion which produced said article) make it far more useful. And that doesn’t even begin to address things like how much easier Wikipedia is to use for mash-ups and data extraction, repurposing the information for other reference works.
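To make that last point concrete, here is a minimal sketch, using only Python 3’s standard library and Wikipedia’s public MediaWiki Action API, of pulling a plain-text article extract for reuse elsewhere. The article title and User-Agent string are just illustrative placeholders, and this is only one way of doing it, not anything either encyclopedia ships. Try doing this with a bound volume.

```python
# Minimal sketch: fetch the plain-text introduction of a Wikipedia article
# via the public MediaWiki Action API, using only the standard library.
# The article title and User-Agent below are illustrative placeholders.
import json
import urllib.parse
import urllib.request

def fetch_intro(title: str) -> str:
    """Return the introductory plain-text extract of a Wikipedia article."""
    params = urllib.parse.urlencode({
        "action": "query",
        "prop": "extracts",   # TextExtracts: plain-text article extracts
        "exintro": 1,         # introduction section only
        "explaintext": 1,     # strip the HTML markup
        "format": "json",
        "titles": title,
    })
    url = "https://en.wikipedia.org/w/api.php?" + params
    req = urllib.request.Request(url, headers={"User-Agent": "extract-demo/0.1"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # The result is keyed by internal page ID; take the single entry.
    pages = data["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")

print(fetch_intro("Encyclopædia Britannica")[:300])
```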
To illustrate this point I make the following challenge:
I hereby challenge any employee of Britannica to a game of Trivial Pursuit. You can consult Britannica Online for any question, and I can consult Wikipedia. Want to take bets on who will win? (I’ll even let you have all 15 print editions as well.) We could also play "Who Wants to Be a Millionaire?" or "Jeopardy" if you want.
So this is the bind that Britannica is caught in. It can market itself as a secondary source: we are a great reference tool. But if it does this, someone can easily point out that Wikipedia is a better secondary source, and free (in other words, libraries can spend dwindling resources on other primary materials). Or it can claim to be a great primary source, a role it simply can’t fulfill. It doesn’t have a place anymore; there are better services doing what it did.
Now seriously, can we end this debate already? Instead, let’s talk to students about how to use secondary sources appropriately, how encyclopedias function, and how all encyclopedias are biased and all knowledge is discursive, and focus on teaching students how to judge credibility and accuracy instead of outsourcing it to the people at Britannica.
David Parry | Blog | Jun 09, 2016 06:35am
Scholastic recently chose to censor a book (or, more accurately, asked an author to alter her book, which is a form of censorship) because one of the characters, wait for it . . . has, gasp, "two mommies." The School Library Journal has the background story. Normally I would stop at passing this article around and encouraging people to act: signing petitions, sending complaints, etc. But a reader sent me the link to Scholastic’s own blog, On Our Minds, and, more importantly, pointed out that this blog somehow makes their list of minds they "admire" (aka their blogroll). Since they "admire" my mind I thought I would give them a piece of it.
Dear Scholastic,
It was recently pointed out to me that this blog, Academhack, is included in a list on your website under the heading "Minds We Admire." Since you admire my mind, I thought I would take the opportunity to share with you what this mind thinks, particularly in response to your recent "UPDATE on Luv Ya Bunches".
I think that writing a 300-word blog post attempting to explain your position, while never once addressing the fact that you asked the author to "clean up the book" by removing the reference to same-sex parents, amounts to a tacit admission that the company asked Lauren (the author) to alter her work. In other words, tacit support of homophobia.
I think that defending yourself by pointing to other books you publish with gay and lesbian characters is a bit like saying "some of my best friends are gay."
I think if I worked for your company I would quit.
I think that publishing the book is far different from actively promoting it at your fairs. This is like a "don’t ask, don’t tell" policy, where you accept difference so long as it doesn’t become too inconvenient.
I think that you are making a business decision, not an ethical one, and I think you should just be honest about this.
I think that you are trying to avoid complaints from conservative, narrow-minded, homophobic people at the expense of presenting and promoting diversity.
I think if I were a children’s author I wouldn’t let you sell my book until you reversed your policy.
I think that censoring diverse voices is a surefire way to propagate intolerance. I think that the communities from which you most fear backlash are perhaps the ones who most need to see this book displayed.
I think bowing to financial pressure over doing what is right is the sure way to end up on the wrong side of history.
I think that you are a business which will make financial choices, but I also think that schools and parents can choose which businesses they want to support.
I think if I were a primary school teacher I would refuse to pass out your catalog to my students.
I think I will buy several copies of Luv Ya Bunches and give them out to kids I know. I can think of no better way to combat your homophobia than by encouraging kids to read books which tell diverse stories. (That and one of your competitors will get my money, sending you a financial message as well.)
David Parry | Blog | Jun 09, 2016 06:34am
Just got back from MLA; much writing, blogging, and reflecting to follow. But in the meantime I seem to have overlooked mentioning that an article I wrote for Flow.TV was published (is "published" the right word in the age of the internet?) last month. For those who are interested, I make the case that New Media is Neither.
More Later.
David Parry | Blog | Jun 09, 2016 06:34am
Two things about the MLA conference that I want to connect here:
1. Clearly one of the themes that has developed in the MLA post-mortem has been the rise of social media and the influence of technology at the conference. Both The Chronicle and Inside Higher Ed noticed the prominence of Twitter at the convention, or the apparent prominence of Twitter. Unlike last year, when the majority of conversation about/on Twitter and the MLA was confined to one session, this year social media was clearly playing a role, although noticeably less than at other conferences.
What is more, as The Chronicle noticed, this seemed to be part of a larger trend in the digital humanities. Ultimately I agree with Mark Sample (@samplereality), who posted an analysis of the MLA tweets, and Matt Kirschenbaum (@mkirschenbaum), who argued via Twitter that this meme/theme was somewhat overstated. As Matt observed, the MLA has a history of being at least marginally receptive to "technology and literacy" panels, even if they have not been placed at the center of the discourse. (Rosemary Feal deserves mad props for her outreach here. Tweeters aren’t always the most reverent or polite bunch, self included, but I am nothing compared to @mladeconvention.) Given that Matt won the MLA book award for best first book, it is hard to ignore the fact that digital humanities is becoming more prominent and more mainstream, if still marginal. But I also think there is somewhat of an echo-chamber effect here. That is, of course those who write online and are engaged with technology are more likely to notice that technology is being talked about. I think if we polled all of the attendees at the MLA, a vast majority of them would have no idea that a conversation (at times academic, at times not) was taking place via Twitter. Indeed, I would venture to guess that a majority could not really describe to you what Twitter is/was.
2. One of the other much-talked-about items at MLA was Brian Croxall’s (@briancroxall’s) paper, or non-paper, titled "The Absent Presence: Today’s Faculty." I say non-paper because Brian, who is currently on the job market and an adjunct faculty member, didn’t attend the MLA; instead he published his paper to his own website. (I am told the paper was also read in absentia.) I won’t recap the whole thing here; you should just go read it. But two things stand out in the article: 1. "After all, I’m not a tenure-track faculty member, and the truth of the matter is that I simply cannot afford to come to this year’s MLA." 2. "And yes, that means I do qualify for food stamps while working a full-time job as a professor!"
For several reasons Brian’s paper hit a nerve. Indeed The Chronicle picked up the story, a piece which for a few days was listed as the most popular story on The Chronicle’s website. His paper became, arguably, the most talked about paper of the convention.
In part, Brian’s story (how the paper became popular, not the content, or at least not yet, more on that in a minute) is a story of the rise of social media and its influence. And this is where I think the real story in the digital humanities is: not the rise of the digital humanities, but rather the rise, or non-rise, of social media as a means of knowledge creation and distribution, and the fact that this rise has changed little. Digital humanities, if it is rising, is rising as "Humanities 2.0," allowed in because it is non-threatening.
So if you imagined asking all of the MLA attendees, not just the social-media-enabled ones, which papers/talks/panels were influential, my guess is that Brian’s might not make the list, or if it did, it wouldn’t top the list. That is because most of the "chatter" about the paper was taking place online, not in the space of the MLA.
Let’s be honest: at any given session you are lucky if you get over 50 people, so assuming the panel at which the paper was read was well attended, maybe 100 people actually heard the paper given. But the real influence of Brian’s paper can’t be measured this way. The real influence should be measured by how many people who didn’t attend the MLA read his paper. According to Brian, views of his blog jumped 200-300% in the two days following his post; even being conservative, one could guess that over 2,000 people gave his paper more than a cursory glance (the numbers here are fuzzy and hard to track, but I certainly think this is in the neighborhood). And Brian tells me that in total since the convention he is probably close to 5,000 views. 5,000 people: that is half the size of the convention.
And so if you asked all academics across the US who were following the MLA (reading The Chronicle, following academic websites and blogs) what the most influential story out of the MLA was, I think Brian’s would have topped the list, easily. Most academics would perform serious acts of defilement to get a readership in the thousands, and Brian got it overnight.
Or, not really. . .Brian built that readership over the last three years.
As Amanda French (@amandafrench) argues, what social media affords us is the opportunity to amplify scholarly communication (if you read only one thing on social media and academia today, read this). As she points out in her analysis (interestingly enough, Amanda was not at MLA but was still tweeting (conversing) about the MLA during the conference), only 3% of the people at MLA were tweeting about it. Compare that to other conferences, even other academic ones, and this looks rather pathetic. Clearly MLAers have a long way to go in coming to terms with social media as a place for scholarly conversation.
But what made Brian’s paper so influential/successful is that Brian had already spent a great deal of time building network capital. He was one of the first people I followed on Twitter, and he was one of the panelists at last year’s MLA-Twitter panel. He teaches with technology. I know several professors who borrow/steal his assignments. (I personally looked at his class wiki when designing my own.) Besides having a substantial traditional CV, Brian has a lot of "street cred" in the digital humanities/social networking/academia world. More than a lot of folks, deservedly so. It isn’t that he just "plays" with all this social media; he actually contributes to the community of scholars who are using it, in ways which are recognized as meaningful and important.
In this regard I couldn’t disagree more with BitchPhD (someone with whom I often agree) in her entry into the MLA, social media, Brian’s-paper nexus of forces. Bitch claims that "Professor Croxall is, if I may, a virtual nobody." Totally not true. Unlike Bitch, he is not anonymous, or even pseudonymous; his online identity and "real world" identity are the same. He is far from a virtual nobody. Indeed, I would say he is one of the more prominent voices on matters digital and academic. He is clearly a "virtual somebody," and he has made himself a "virtual somebody" by being an active, productive, important member of the virtual academic community. If he is anything, he is a "real nobody" but a "virtual somebody." In the digital world, network capital is the real "coin of the realm," and Brian has a good bit of it, which, when mustered and amplified through the network capital of others (@kfitz, @dancohen, @amandafrench, @mkgold, @chutry, @academicdave: all of us tweeted about Brian’s piece), brings him more audience members than he could ever really have hoped to get in one room at the MLA.
And so Brian isn’t a virtual nobody, and he isn’t a "potential somebody"; he is a scholar of the digital humanities, one who ought to be recognized. But here is the disconnect: Brian has a lot of "coin" in the realm of network capital, but this hasn’t yielded any "coin" in the realm of bricks-and-mortar institutions. If we were really seeing the rise of the digital humanities, someone like Brian wouldn’t be without a job, and the fact that he published his paper online wouldn’t be such an oddity; it would be standard practice. Instead, Brian’s move seems all "meta- and performative and shit" when in fact it is what scholars should be doing.
And so, in the "I refute it thus" model of argumentation, I offer up two observations: 1. The fact that Brian’s making his paper public was an oddity worth noticing means that we are far from the rise of the digital humanities. 2. The fact that a prominent digital scholar like Brian doesn’t get even one interview at the MLA means more than that the economy is bad and tenure-track jobs are not being offered; it means that universities are still valuing the wrong stuff. They are looking for "real somebodies" instead of "virtual somebodies." That is something the digital humanities has the potential to change (although I remain skeptical).
In the panel at which I presented, an audience member, noting the "meme" about the rise of the digital humanities, asked if all of this "stuff" about digital humanities just reflected our fascination with gadgets, or how we balance our technology with humanities: how does the digital affect the humanities in a non-gadget way? (I paraphrase, but that’s the thrust of the question.) After a few of the other panelists answered, I suggested that the question was bad (this is often a rhetorical trope I employ). I said that instead of thinking of the word "digital" as an adjective which modifies the humanities, the Humanities 2.0 model, I am more interested not in how the digital affects the way we do the humanities, but in how the digital can fundamentally change what it means to do the humanities, how the digital might change the very concept of "the humanities." I don’t want a digital facelift for the humanities; I want the digital to completely change what it means to be a humanities scholar. When this happens, then I’ll start arguing that the digital humanities have arrived. Really, I couldn’t care less about text visualizations or neat programs which analyze the occurrences of the word "house" in Emily Dickinson’s poetry. If that is your scholarship, fine, but it strikes me that that is just doing the same thing with new tools. Give me the "virtual somebodies" who are engaging in a new type of public intellectualism any day. Better yet, if you are a university and want to remain relevant in the next moment, give these people a job.
David Parry | Blog | Jun 09, 2016 06:33am