A Survey of Unanswered Questions in CALL: Results & Discussion

This section includes selected parts of the survey results and some preliminary discussion of the findings. I plan to write a more detailed paper and submit it for publication within the next few months. If you have questions requiring additional data I might have in the meantime, email me: phubbard@stanford.edu.

I reviewed the research questions without considering the information in the optional background/rationale or methodology/comments sections. I identified four top-level categories: design-centered issues, effectiveness issues, learner-centered issues, and research issues. There were twenty-four narrow categories distributed among these. (A full list of the narrow categories can be viewed on the website by clicking on the top-level categories.) I also wrote a descriptor (10-word maximum) for each submission to make it easier for website users to browse for items of interest to them. The top-level categories included the following numbers of questions:
The total here is 97 because some respondents submitted compound questions. The following narrow categories had the largest numbers of responses:
One of the research questions motivating this study was what trends could be identified by surveying CALL professionals. Looking at these last two sets of statistics, it is interesting that questions of effectiveness still tend to dominate. In fact, the basic questions of "Is CALL effective?" and "Is it more effective than alternatives?" remain popular even among those who have been centrally involved in the field for an extended period of time. Also interesting is the fact that learner-centered issues, particularly those involving learner variables, were identified as important. This takes us back to the earliest days of CALL, when the promise of individualization was a commonly mentioned strength of CALL over alternatives.

A second research question addressed by this study had to do with identifying differences among those who identified themselves as primarily researchers, practitioners, or developers. There were no particularly earth-shaking results here, but the following patterns were observed (where D = those identifying themselves as primarily developers, P = primarily practitioners, and R = primarily researchers):
In interpreting these results, it is important to recall that the sample was not random, the response rate was just 53%, and the main survey question did not ask respondents to identify "the most important" unanswered research question but simply "ONE important research question in CALL that you would like to see studied." The trends identified here are therefore empirically weak, but they may still provide some insight into the current state of mind of a large group within the field. Coupled with comprehensive reviews of current research such as those in Levy (2002), they can offer a clearer impression of the hazy state of CALL research than the editorializing of individual authors.

This concludes the survey report. If you have not already done so, take a look at the survey contributions. Feel free to offer constructive comments on the site as a whole or on individual submissions, and if you have an important CALL research question you would like answered, I encourage you to submit it.

Reference

Phil Hubbard