July 29, 2004

Perspective(s)

Thanks, all, for the kind comments and emails re the new homepage design. The longer I look at it, the happier I am. I think it's pretty well set--I've added credits, and links both to email and a slightly dated version of my cv. Good enough for now, I think.

I'm gearing up for a dissertation defense later today, and took a break from poring over chapters to try and bring my aggregator under control. There are a couple of interesting pieces I wanted to point to. danah boyd takes the NYT to task for demeaning the unprecedented amount of blogging going on at the DNC this week. What's most interesting about this is that she was asked to expand on this entry, and to turn it into an essay for Salon. In it, she shifts gears quite a bit, and frames a fair portion of the essay in terms of the ideals of objectivity versus the virtues of multiple perspectives. It's an old, old debate, and one that's getting fresh legs as the mainstream media responds to blogging.

The essay is interesting in and of itself, but I recommend it also to those who plan on using (or already are using) blogs in their writing classrooms. Pairing these two pieces might provide a really nice example of what it means to move from blog to essay, or simply to move from one audience or space to another. I was struck, for example, by the move from the first passage here to the second:

By framing bloggers as diarists, the NYTimes is demanding that the reader see blogs as petty, childish and self-absorbed.
In order to signify the difference between blogging and "real journalism," it is not that surprising that the New York Times drudges up connotations of 13-year-old girls writing about their lives. It helps to belittle the role of convention bloggers who have been given the same press credentials as reporters.

Neither of these is the "correct" or "better" one for me; each is effective given its context, and helps to point out some of the differences between those contexts. For the record, I'm guessing that "drudges" is simply a misspelling of "dredge," and not a subtle dig at the Drudge Report, but who knows?

The other pointer I have is to David Weinberger's site, where he likewise tackles the question of objectivity. In this case, he examines coverage from the Boston Globe, and discusses how the "necessity" of devising headlines and leads interferes with journalists' ability to be objective. It's a really nice reflection on the gap between convention and coverage, conceived in terms of the rhetorical demands of two very different media (speech v. news story).

And if I may be a disciplinary homer for a moment, what's most refreshing about DW's piece is that he uses the word "rhetoric" correctly. Yeah, he cites Heidegger too, but that's just gravy by the time I get to it. heh.

July 28, 2004

bad habits

[image: a piece of my new homepage]

How I know I am a geek: Whenever things start piling up, and I don't seem to have any spare time, somehow, I manage to free up about three hours for the most purposeless activities. Case in point: tonight/this morning, despite having more than enough to do, I decided to blow about three hours redesigning my homepage. And before you ask, no, I don't think I could be any more of a dork.

As you'll notice if you visit, the inspiration is drawn directly from Scott McCloud, whose Understanding Comics, Reinventing Comics, and 24 Hour Comics are all sitting on the shelf beside me. I may not have his talent, but at least I've got decent taste. The self-portraits were all generated at Abi Station, which I'll credit on the page itself soon enough. Also credit-worthy is Blambot, for the fonts that I futzed with in putting the page together.


iPods rule (all your other appliances)

Steven Johnson was saying, just a couple of weeks ago:

Because what I need now in my iPod is not more storage space, or Mini-style color designs -- what I need is wi-fi. I want my iPod to double as an audio remote control when I'm sitting in my living room. I want to be able to call up any song on any computer in home network, and direct it to any set of speakers, right from the iPod scrollwheel.

I don't know that it's gotten to that point just yet, but the folks at engadget (a link caught at boingboing) now have a how-to column about turning your iPod into a universal infrared remote control. Pretty speedy service, and damn cool, to boot.

[image: remotepod.jpg]

July 25, 2004

The Network Fallacy?

Bopping around this morning, and came across Stanley Fish's latest column in the Chronicle. In it, he cites the conclusion that Mark Taylor arrives at in The Moment of Complexity:

Either argument -- the one that begins, no longer is it possible to maintain the divide [between the academy and society], or the one that begins, there never was a divide in the first place -- leads Taylor to the same conclusions: Let's stop pretending that we can operate in a splendid (but fictional) isolation from everything that enables us; let's accept the fact that we are in, and of, the market and "find new ways to turn market forces to [our] own advantage"; let's prepare "students for life and work changing at warp speed"; let's go beyond the kind of critical analysis that does little more than "promote organizations and institutions whose obsolescence is undeniable"; let's adapt to the real conditions of our existence and eschew "a politics that is merely academic," a politics that is "as sterile as theories that are not put into practice."

As you might imagine, Fish disagrees on a couple of different grounds. The one that I was most intrigued by was what he calls the "system or network mistake":

the argument, more than implicit in Taylor's pages and in the pages of many other theorists of our condition, makes what I would call the "system" or "network" mistake -- the mistake of thinking that because something is embedded in a network that sustains that thing and gives it both value and shape, it is incoherent to speak of its properties, or of the boundaries that separate and distinguish it from other nodal points in the network. Since identity is network-dependent, the reasoning goes, nothing can be spoken of and examined as if it were free standing and discrete.

The trouble with that reasoning is that it operates at a level of generality so high that you can't see the trees for the forest.

Well, yes and no. This is not a new "mistake"--it's been around at least since the heyday of poststructuralism (and it would be easy to trace back through Burke and IA Richards as well). There, it was used as a reductio ad absurdum with which to point out the problem with deconstruction and the like--if it's all "free play of signifiers," then nothing means anything, and we might as well give up, blah, blah, blah. Basically, it involves ignoring one half of KB's "paradox of substance."

Calling it a "network mistake" doesn't quite work for me, because it ignores the degree to which network theory toggles among nodes, links, flows, patterns, et al. And I only buy system as a name for it if we're working with that term circa Jacques Ellul, and Taylor most definitely is not. In fact, as far as I can tell, the only way Fish could have arrived at this characterization is by reading only the final chapter. Maybe I'm wrong about that, but even though it's been a couple of years since I read it, I know that Taylor's discussion of complex adaptive systems is more nuanced than the false dichotomy of forest/tree being offered here.

The "argument" as Fish lays it out sounds no better if the terms are reversed:

the mistake of thinking that because something has boundaries that separate and distinguish it from other nodal points in the network, it is incoherent to speak of the network that sustains that thing and gives it both value and shape. Since nodal points are free standing and discrete, the reasoning goes, nothing can be spoken of and examined as if it were network-dependent.

Network theory, as partially as I may understand it, poises itself between these false alternatives. And from that perspective, it's entirely plausible for Taylor to argue that we need to reconsider the cultural, political, and social flows that connect us to various other points "outside" the university. There are places where I really disagree with Taylor's proposed solutions (much of that disagreement is a result of the spectacularly miscalculated keynote he delivered at C&W a few years back), but I also respect the fact that he proposes solutions, and would prefer to see them engaged at that level. Fish has never tired of the strategy whereby he pulls out rugs at a logically prior point, both invalidating the conclusions and removing any need to engage with them. It's certainly a fun tactic to watch, but it rings a bit hollow when it's applied to a writer who's as careful and as skilled as Taylor is. Taylor, quite frankly, deserves better.

The latter half of the column goes on to engage in a longer-running crusade of Fish's, the place of morality (or politics or diversity) in academic institutions. While there may indeed be curricular implications to Taylor's position, it differs from the issue of MAC (morality across the curriculum) in ways that Fish doesn't seem to acknowledge. In his February column, he explains that:

The left may have won the curricular battle, but the right won the public-relations war.

While the two are certainly related to one another, Fish has no trouble conceptually separating them in February, and honestly, it's not that tough to see that Taylor's advocating that the public-relations war be reopened. Taylor himself may fall afoul of Fish's arguments re curriculum, but to suggest that this is all Taylor is talking about is to neglect the very distinction with which Fish opens that Feb column. And the result is a July column that paints Taylor in a pretty unflattering fashion, which he almost certainly does not deserve.

And here I thought all I wanted to do was to scold Fish for using the word network without my permission...;-)

July 24, 2004

I must confess my curiosity

Will notes that Microsoft has set up an RSS feed for topics of interest to educators. I admit it, I'm curious and, as of a few minutes ago, subscribed.

The first entry I looked at connects to a white paper at the MS site: Learning in a Connected World. Even downloaded it. But let me make one thing clear to the person from Microsoft who comes here as a result of the obligatory corporate egosurf. When the page told me that the paper explores, and I quote:

A solution architecture to help educators map a technology infrastructure to the challenges faced by institutions today.

I can't tell you how close I came to unsubscribing right there. Laugh all you like at the tortured prose of academic writers, poststructural theorists, etc., but this is abominable stuff. An architecture, and a solution architecture no less, to help me map an infrastructure to challenges? OMG.

At least the page doesn't use "leverage" as a verb--oh wait, that's in the subtitle.

I know that it's basically an extended advertisement for MS products, and I know that every network generates its own jargon which is more or less opaque, but still. Look at it this way: if academic prose is so bad as to function almost like a second language, why in the world would we want to learn a third language in order to be advertised at? Hire a couple more Discursive Efficacy Engineers, would you?

(I'm available on a consultant basis)

July 23, 2004

Will Blog for Cash

Things are getting a little crazy when I'm posting twice in a week, I know. The manuscript is still proceeding at a steady pace, and that gives me permission to stay even with my Bloglines subscriptions. And to notice the following convergence of posts:

Got all that? I don't want to try and reproduce it all, bc it's worth your time not only to read each of those posts, but especially the comments (49 last time I checked) at leuschke.org. There's a convergence here, perhaps of my own devising, of folk wondering about the relationship between their blog personae and the "real" world, and there are lots of answers represented among these various sites.

I've thought about this more than I normally would, bc Lori and I had a conversation about what she should do--and as she notes, there's a choice going on there between being "principled" (if they don't like who I am, then I wouldn't want to work there) and "pragmatic" (lots of us work in places and among people, quite successfully, without being liked by all, but it doesn't make sense to wreck your chances before you've even started). My own advice was pragmatic--and my own approach to this space is similarly pragmatic. I try not to say here what I'd be unwilling or uncomfortable saying among my colleagues, students, etc., but mainly this is a place for me to read, write, and think in ways that have been largely tangential to what I do on a daily basis as an employee of the university.

What struck me upon reading the comments at Leuschke.org was how my position shifted. I tend to agree with Steve (and Graham) about anonymity, but one thing that the comments really brought to my attention was that this position has a lot to do with the fact that I'm relatively comfortable. I don't have tenure, no, but I'm finishing up my first book, get pretty good teaching evaluations, contribute to the department in a range of ways, and I believe that my colleagues are quite pleased at having hired me. I'm also a big, white man, who hasn't had to worry about unwanted attention, who is comfortable screening the material that appears here, and who doesn't really have to worry about the kind of surveillance that some of the comments discussed. In other words, there's a certain amount of privilege involved with the fact that I can write as myself here, without much fear of official reprisal or risk.

That being said, there were also some comments from folk who worried that the perception at home institutions would be "if s/he's blogging, s/he's not doing scholarship, and we can't have that," and to those people, I'd love to forward Stuart's post, and to transpose it into academic terms. To a certain degree, Stephen Bainbridge already has, reprinting an email he received from a friend of his who's a dean at Villanova (this was back in January). Among a variety of interesting points that his friend makes:

Blogging or, more precisely, interaction among bloggers and their readers, strikes me as something very useful to people doing more conventional scholarship. Most realize, I think, that scholarship is not done in a vacuum, and that the ability to test one's ideas, and to get ideas from others, would help in writing articles and books. Blogging helps with all that tremendously and in novel ways. In fact, I'm advising my junior colleagues to start following the blogs in their fields, and to think about contributing where appropriate.

I'm sure that academia will lag behind industry in this (as in so much else), but it'd be nice to start seeing some of the people who have been worrying at the importance of according equal weight to electronic scholarship spend their time working blogs into that equation as well. And/or changing that equation to include the kind of work that's being done well outside of the restricted economy of peer review. There's also some thinking here to be done about the relative transparency of blogs (compared for instance to the 24-hour, one-way transparency of email as it's often used by students) and how they overlap with other academic organizations/networks.

July 20, 2004

iLust

[image: the new iPod]

I know that I should be saving my nickels and dimes for the trip this fall, but I couldn't help myself. Really. I've got one of the first-generation iPods, purchased the first week that Apple was rolling them out. That rollout happened to coincide with the technology budget that I'd received as a new hire at Syracuse, and of course, the ability to use my iPod as a portable drive allowed me to add it to my wish list.

And I've slowly watched the damn things improve, to the point where a lot of the accessories are simply incompatible with my pokey little 5 gig, first-gen pod. Heck, I've stopped downloading iPod software upgrades, for fear that I'd mess it up somehow.

And so, I placed my order this afternoon for a new "click wheel," 4th gen model, along with car adapter, extra dock, etc. The works. Couldn't help myself. And as I was poking around at sites checking out accessories, I came across the following article about an NEC project, called P-ism (P as in Pen). Oh. My. God. Here's one of the promo photos:

[image: pen-based computing]
The design concept uses five different pens to make a computer. One pen is a CPU, another a camera, one creates a virtual keyboard, another projects the visual output and thus the display and another a communicator (a phone). All five pens can rest in a holding block which recharges the batteries and holds the mass storage. Each pen communicates wireless, possibly Bluetooth.

Let me say that again. Oh. My. God. Of course, the prototype cost about $30K to put together, and realizability is sketchy on a couple of the pens, but damn. It's enough to make me go out and buy a pocket protector.

July 16, 2004

The 1st Annual Bridget Moynihan Film Festival

At least, it would have been, had I gotten a hold of Coyote Ugly and Serendipity and watched them. As it stands, I did happen across The Recruit a couple of nights ago, and more to the point, I caught a matinee of I, Robot this afternoon.

I must admit that I didn't expect the movie to be as good as it was. Not great or anything, certainly, but better than I thought it'd be. And I'll happily admit that I expected it to be either a horrible abuse of Asimov or yet another Hollywood installment of "are we in charge of our machines or...[dramatic pause]...are they in charge of us?!" There's a little of the latter going on here, certainly, and I'm sure I'll see some Asimov purists slapping at the movie, but by and large, I don't think I'll regret seeing it. Reminds me of the pleasant surprise I had from Enemy of the State--both movies slide in a little bit of scifilosophy in the guise of an action movie. There'll be some who wanted more action and others who wanted more scifi, but for a movie that tries to do some of both, this was pretty good.

Anyhow, I found out afterwards why I was pleasantly surprised--I went in without knowing that Alex Proyas directed this. He's the guy who did Dark City and The Crow, both of which are a little underrated as scifi films, I think. There were some nice touches that brought the movie above average for me, and it was less surprising when I saw his name. Proyas doesn't pass on opportunities to tell story through setting and scene, and scifi films are some of the best examples of this--think Blade Runner and more recently Minority Report. There's a little of that going on here, specifically in the tension b/w old and new that comes across in a number of the street scenes. Some of that tension is ham-handed--Spooner's obsession with a "vintage" pair of Chuck Taylors drove me up a wall, but thankfully, that was minimal. An especially nice touch comes when Moynihan, who's Smith's technophile mirror, is bewildered by Smith's CD player, which isn't voice-activated.

And the play between Smith and Moynihan isn't bad. There's one point where we get beaten over the head with the fact that the two of them are mirror images (him as unthinkingly technophobic, her as unthinkingly technophilic), but the two of them together also end up serving as foils for Sonny, the robot who triggers the whole plot. It's not as subtle as it could have been--Moynihan's character is a little too intentionally stiff (i.e., robotic) in the beginning--but it's also not as heavily played as it could have been. The plot works like a much more nuanced version of Paycheck in that there's a trail to be followed, but it unfolds more naturally than did the trail in that movie. The only complaint I had in that regard is that we don't need to see a copy of Hansel and Gretel in the lab (!!!) to realize that there's a trail to be followed.

I got a vibe off of Moynihan that reminded me a little of watching Sandra Bullock in Demolition Man, and I wonder if Moynihan won't break through in the same way. This is really her first genuine co-star role--each of the movies mentioned above positioned her as furniture more or less, and here, she actually has a chance to be a person who changes over the course of the movie.

The acting's not bad, the plot is better than average, and the CG is actually pretty stellar. I could see the conflict that Proyas may have felt between telling the story and "making the blockbuster," and I'd say that he did a pretty good job sneaking enough of the former in there to make the movie worth seeing. I doubt it'll be top-5 for the summer, but it's not a bad movie if you've got an afternoon...I'd make it a solid matinee.

July 11, 2004

The 1st Annual Clive Owen Film Festival

Okay, so not really.

Nevertheless, I did manage to catch Clive in perhaps his three best-known movies this weekend. First, I went to a morning (which was for me about midnight) showing of King Arthur, and had the entire theatre to myself. Second, I was buzzing around the dial that evening, and found that they're hyping the Bourne Supremacy by showing Bourne Identity nearly every hour on the hour. Finally, I found a copy of Croupier in the bargain DVD bin, and snatched it up.

Croupier is a decent movie--not stellar or anything, but I remember seeing it on the big screen way back when. One of the things that makes Croupier work is Owen's emotional distance as an actor. It's almost certainly a Brit thing, but his character in Croupier is a writer who ends up getting back into the casino life and writing about it. He's in it, but he's also watching himself in it, and the struggle between those two selves (Jack and "Jake") propels the movie. Owen's understated performance makes that work pretty well.

Unfortunately, his acting ability hasn't really changed that much in the 5 or so years since, which makes him an odd casting choice for the role of Arthur. His charisma in the movie is almost intellectual or philosophical--he's an idealist half-Roman, half-Briton who follows a particular philosopher and commands a band of Sarmatian knights. He and the rest of the knights are gritty enough, I suppose, but there's not a great deal of drama in the movie. Stellan Skarsgard plays the Saxon chief who's Arthur's primary nemesis (other than the decadent Roman bishop), and his acting is understated as well. He's the chief of this huge Saxon horde, but he comes off more cynical than dangerous.

Hmm. It's no accident, I suspect, that if you check the poster for the movie, the "main" character is Keira Knightley as a leather bikini-clad warrior princess. They don't do much more than hint at the love triangle, and they kill Lancelot before there's a chance for it to develop, but then development isn't really a strength of this movie. Even the climactic scene, the wedding of Arthur and Guinevere, doesn't really feel like it's been earned. And that's true of most of the movie, which gestures towards historical accuracy by opening with some mention of archaeological finds, downplaying all of the magic, grounding it in a specific time period, etc. But what happens is that the story loses a tremendous amount of its juice as a result. I didn't have to sit there like I did with Troy and bracket off all of the obvious Americanisms in the treatment of the story. But that's not to say that there weren't any. Arthur is clearly drawn as proto-American, with decidedly unhistorical beliefs about the fallacy of the Church, the equality of all humanity, etc.

Ugh. I'm talking myself into a lower opinion of the movie than I originally thought I held. I paid matinee price for it, but I'm not sure it was worth that. This is probably one to wait for, maybe a rental...

July 8, 2004

bloglines

As a couple of others have observed already, Bloglines has been up and running now for a year. In addition to a new site design that adds a few nice touches, they've added a service called clippings, or clip blogs:

And we're really excited to introduce our biggest new feature:

Bloglines Clip Blogs

  • The easiest way to create a blog
  • Fully integrated with all your Bloglines news feeds
  • One-click blogging from any Web page
  • Subscribe to friends' Clip Blogs and get notified of updates
  • Simply click on the 'My Blog' tab to set up your Bloglines Clip Blog
  • Best of all, your Clip Blog is completely free -- just like the rest of Bloglines!

In addition, Saved Items have been renamed to Clippings, and you can easily move private, clipped items to your public blog and back again.

Interesting stuff. As Will implies, Bloglines has taken a pretty big step in the Furl direction with this, and I think it's a smart one. I also like how they're enabling various social features in their service--there was a point where I was thinking about switching over to Shrook, but Bloglines is keeping me loyal...

July 2, 2004

practice v research

Just a couple of quick notes, while I'm on a brief break from the manuscript...

Nancy White posts over at M2M about the relationship between practice and research when it comes to social software. She's riffing on some comments of danah's which in turn responded to some of Liz's remarks in her blog research post there. It's been interesting to me to watch the M2M conversation unfold in parallel with Liz's conversation with Elijah, who posts the emails at his blog, and continues the conversation at M2M. Got all that? Heh. It's probably easier to just read M2M than to try and follow my winding attempt at a summary here...

Anyhow, Nancy's post (and a remark she made in the comment section to another) got me thinking a bit. She casts this discussion in terms of distance, arguing (rightly, I think) that we need to give up some critical distance and actually use/practice/experience different media before we can speak with any kind of authority or credibility about them. And even then, extrapolating from that experience to speak about the medium itself is a pretty dicey proposition. (I've got a riff on this very issue in the manuscript...)

This reminds me a lot of a conversation I engage my students in whenever I teach research writing. We talk about whether it's easier to work with a topic you know well or one you don't. At first glance, I think it's easier to work from what you know, but then that ignores the bad teachers out there (I've had a few, and tried not to be one), people who know their material cold but don't know how to communicate it to people without that same experience. Sometimes, it's more effective to work with an unfamiliar subject: the questions you ask as a researcher are likely to be the same questions your audience will have, you don't take specialized language for granted, etc.

Nancy's comment to the conversation, about providing a bridge between practitioners and academics, made me think about this, bc I think there are really two bridges there. As someone who's both practitioner and academic, I find myself trying not to take academic habits of thought for granted when I post here, but I also find myself doing the same with practitioner habits when I talk to other academics about what I'm doing. As Alex has talked about, those of us techademics who blog are the tip of a big, slow academic iceberg.

Speaking for myself, of course, I feel oddly suspended between these two audiences, bridging sometimes more in one direction, sometimes more in the other. Like Nancy, I've got little patience for "my toys are better than your toys" kinds of contests, but as someone whose "official" writing is often for an audience that hasn't seen any of the toys before, I can't dismiss those kinds of discussions out of hand. That being said, there are right ways and wrong ways to hold them, of course.

July 1, 2004

Cal-tipping

Time to tip the calendar, even though I don't have a great deal to talk about today. I'm deep into a chapter on invention, and I'll be looking for a couple of readers over the next few days. I'm hoping to finish it up quickly so I can turn full attention to a couple of other chapters that have been more intermittent.

One thing that I've noticed lately is that I need more than 24 hours in a day. I don't mean that I'm too busy--I mean it quite literally. I've found that I generally prefer about 8-9 hours of sleep, but also that my ideal up-time tends to be around 18 hours or so. This means, in the absence of scheduled events, that my bedtime drifts by about an hour or so every couple of days. Currently, I'm going to bed around 11 am, and waking up at about 7-8 pm. Next week, I expect I'll have drifted to early afternoon bedtimes. By mid-July, my schedule may very well be synced with the rest of the "normal" world.

My conclusion is simple. I need the earth to rotate about 8% more slowly, so as to create 26-hour days. That'd be perfect for me. I don't really feel like I need to do more in a day, but it would make it easier for me to match my internal rhythms to the external ones.
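And for anyone inclined to check my math, here's a quick back-of-the-envelope sketch in Python (nothing official, just my own figures from above: 18 hours of up-time plus the low end of my 8-9 hours of sleep):

    # How much slower would the earth need to rotate
    # to stretch the day out to my preferred length?
    preferred_uptime = 18.0  # hours awake
    preferred_sleep = 8.0    # hours asleep (low end of my 8-9)
    desired_day = preferred_uptime + preferred_sleep  # 26 hours

    slowdown = 1 - 24.0 / desired_day
    print(f"Rotation would need to be about {slowdown:.1%} slower")
    # prints: Rotation would need to be about 7.7% slower -- "about 8%"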

If someone could get on that for me, that'd be super.